Perspectives in
Image-Guided Surgery
Proceedings of the Scientific Workshop on Medical Robotics, Navigation and Visualization

Perspectives in Image-Guided Surgery

RheinAhrCampus Remagen, Germany
11-12 March 2004

edited by

Thorsten M Buzug, RheinAhrCampus Remagen, Germany
Tim C Lueth, Humboldt University of Berlin, Germany

World Scientific
New Jersey · London · Singapore · Shanghai · Hong Kong · Taipei · Chennai
Published by
World Scientific Publishing Co. Pte. Ltd. 5 Toh Tuck Link, Singapore 596224
USA office: Suite 202, 1060 Main Street, River Edge, NJ 07661
UK office: 57 Shelton Street, Covent Garden, London WC2H 9HE
British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.
Cover picture: Courtesy of Dr. Frans A. Gerritsen, Philips Medical Systems, Best.
PERSPECTIVES IN IMAGE-GUIDED SURGERY Proceedings of the Scientific Workshop on Medical Robotics, Navigation and Visualization
Copyright © 2004 by World Scientific Publishing Co. Pte. Ltd. All rights reserved. This book, or parts thereof, may not be reproduced in any form or by any means, electronic or mechanical, including photocopying, recording or any information storage and retrieval system now known or to be invented, without written permission from the Publisher.
For photocopying of material in this volume, please pay a copying fee through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA. In this case permission to photocopy is not required from the publisher.
ISBN 981-238-872-9
Printed in Singapore by World Scientific Printers (S) Pte Ltd
FOREWORD

The application of computer-aided planning, navigation and robotics in surgery provides significant advantages over conventional interventions. This is due to today's sophisticated techniques of patient-data visualization in combination with the flexibility and precision of novel robots. Robotic surgery is going to revolutionize surgical procedures. Augmented with 3D image-guidance technology, these tools give finer control over sensitive movements in diseased areas and, therefore, allow more surgical procedures to be performed using minimally invasive techniques than ever before. Perspectives in Image-Guided Surgery is the proceedings of the scientific workshop on Medical Robotics, Navigation and Visualization (MRNV 2004) held in Remagen, Germany, March 11-12, 2004.

Medical technology is a key engineering topic at the beginning of the new millennium, especially for those countries with changing age statistics, i.e. growing geriatric populations. In 1998 the US Food and Drug Administration (FDA) published a study* on the major trends in medical engineering for the coming 10 years. In this study, worked out by an expert team of the Center for Devices and Radiological Health (CDRH), six medical areas were identified. These future mega-trends were expected to be 1) Computer-related technologies, 2) Molecular medicine, 3) Home- and self-care, 4) Minimally invasive procedures, 5) Device/drug hybrid products and 6) Organ replacement/assist devices.
It is far beyond the scope of the workshop on Medical Robotics, Navigation and Visualization to address all the areas above. However, in addition to these six identified trends, the CDRH experts noted a number of further patterns having significance for medical device evolution. Among others, the study sees cost-containment pressures, technologically-tailored product customization, interest in medical 'enhancement' and, as mentioned above, growing geriatric populations as supplementary framework and trend-driving issues.
* W. A. Herman, D. E. Marlowe and H. Rudolph, Future Trends in Medical Device Technology: Results of an Expert Survey, Center for Devices and Radiological Health, FDA (1998). See: http://www.fda.gov/cdrh/ost/trends/toc.html.
The two major trends, computer-related technologies (including computer-aided diagnosis, intelligent devices, biosensors and robotics) and minimally invasive procedures (including minimally invasive devices, medical imaging, micro-miniaturized devices, laser diagnostics and therapy as well as robotic surgical devices), actually are key topics in the scientific workshop held at RheinAhrCampus Remagen on March 11-12, 2004. And from the results presented during the workshop we can summarize that we are in line with these identified trends six years after publication of the FDA study.

At the workshop a significant number of clinical case studies on image-guided surgical procedures were presented, ranging from bony tumor resection using navigation principles via functional-imaging-guided neuro-navigation to the image-guided setting of dynamic hip screws. If these are connected with the more engineering-oriented presentations of clinical research groups, it becomes clear that navigation and image guidance in the OR have turned from medical research into an everyday surgical tool. Nevertheless, new and risky treatment scenarios must be modelled and simulated before testing in vivo. At the workshop this was demonstrated for computer-aided suturing, image-guided cryotherapy and radio-frequency ablation. Additionally, prediction and simulation studies for force feedback systems as well as robot simulation systems were presented.

Actually, registration is the basis of all image-guided procedures. However, the fact that the mathematical principles of registration are well known does not mean that all practical problems are solved. Especially the acquisition of homologous data in the imaging domain and the operating room is a focus of ongoing work. Here, laser surface scanning and holography-based procedures are of great interest, since laser technology promises to reduce the surgeon's interaction efforts, e.g. in fiducial marker selection during the intervention.
Closely related to registration research are calibration tasks and accuracy analysis studies. It becomes apparent that it is important to see the entire registration chain from the preoperative imaging device via the intraoperative OR tracking and navigation device to the surgical tool holder and the tool itself. Particularly for the surgical tool, e.g. a biopsy needle, L.-P. Nolte pointed out that tool bending must not be neglected. Clinical experts report overall accuracies in the range between 1 and 3 mm.

The FDA study ranked virtual reality technologies as purely educational devices. From today's point of view it must be admitted that the hype on fully immersive VR systems has not arrived in the operating theatre. However, sensor feedback systems that are able to augment reality seem to be entering the OR. Today nobody believes in fully autonomous surgical robot systems. Besides the legal and patient-acceptance problems, artificial intelligence systems that
could manage all interventional eventualities are not at hand. However, robots that assist the surgeon, in the sense of e.g. intelligent drill speed control, are emerging.

From the list of contributions it can be seen that MRNV 2004 was a scientific workshop on new image-guided procedures in all medical application areas. The workshop brought together scientific, medical and application experts from university, clinical and commercial sites. It was organized by the RheinAhrCampus Remagen and the Charité of the Humboldt University of Berlin, and locally organized by the Center of Expertise for Medical Imaging, Computing and Robotics (CeMicro) in Remagen, Germany. CeMicro is a research center for medical image acquisition technology, medical signal processing and computation, and improvement of clinical procedures, especially in image-guided interventions with robotic assistance. The research center is associated with the RheinAhrCampus Remagen on the river Rhine, lying close to the Ahr valley, the Eifel countryside and Bonn, the former capital of Germany.

A remarkable point about CeMicro at RheinAhrCampus Remagen is its exceptional standard of medical imaging equipment. All modern scanners, ranging from magnetic resonance (MR) and computerized tomography (CT, including micro-CT) via X-ray fluoroscopy, ultrasound and endoscopic technology up to thermography with recent QWIP technology and video-sequence acquisition systems, as well as modern 3D laser scanners, are available in campus laboratories. There is no other technology-oriented university of applied sciences in Germany with a comparable range of complex medical imaging systems with full-time accessibility for technology-oriented research projects. Research within CeMicro at RheinAhrCampus Remagen includes activities in the fields of imaging, simulation, navigation and robotics for scenarios in the OR.

Remagen, April 2004
Thorsten M. Buzug Workshop Chair MRNV 2004
ACKNOWLEDGEMENTS

As chair of the MRNV 2004 workshop I would like to thank the members of the program committee for the selection of works included in these proceedings. The members of the program committee are:

Nicholas Ayache, INRIA Sophia Antipolis
Jens Bongartz, RheinAhrCampus Remagen
Thorsten M. Buzug, RheinAhrCampus Remagen
Thomas Christaller, FhG AIS, St. Augustin
Olaf Dössel, University of Karlsruhe
Rudolf Fahlbusch, University of Erlangen-Nürnberg
Toshio Fukuda, University of Nagoya
Heinz Handels, UKE, University of Hamburg
Ulrich Hartmann, RheinAhrCampus Remagen
Stefan Haßfeld, University of Heidelberg
David Hawkes, King's College, University of London
Peter Hering, University of Düsseldorf
Gerd Hirzinger, DLR Oberpfaffenhofen
Dietrich Holz, RheinAhrCampus Remagen
Erwin Keeve, Caesar Bonn
Ron Kikinis, Harvard Medical School, Boston
Frithjof Kruggel, University of Leipzig
Heinz U. Lemke, Technical University of Berlin
Steffen Leonhardt, Helmholtz Institute, RWTH Aachen
Sven Loncaric, University of Zagreb
Tim C. Lueth, Humboldt University of Berlin
Seong K. Mun, Georgetown University
Wolfgang Niederlag, Dresden-Friedrichstadt General Hospital
Frank Pasemann, FhG AIS, St. Augustin
Wolf Rathgeber, Europäische Akademie Bad Neuenahr
Torsten Reichert, University of Mainz
Georg Schmitz, RheinAhrCampus Remagen
Jocelyne Troccaz, University of Grenoble, CNRS, La Tronche
Max Viergever, University of Utrecht
Heinz Wörn, University of Karlsruhe
Gerhard Wahl, University of Bonn
For financial support of the MRNV 2004 workshop I would like to thank the German Federal Government. MRNV 2004 was designated one of the key events of the science festival in the last year of the Bonn-Berlin contract. Further, I would like to thank NDI Europe GmbH (especially Dr. Christian Lappe), the TÜV Rheinland Group (especially Dr. Meinolf Gerstkamp) and the Fraunhofer Institute IPK in Berlin (especially Prof. Dr. Tim Lueth) for supporting the workshop.

Warm thanks go to the members of the local organization team: Tobias Bildhauer, Michael Böttcher, Holger Dorle, Anke Hülster, Marie-Sophie Lafontaine, Birgit Lentz, Kerstin Lüdtke-Buzug, Volker Luy, Gisela Niedzwetzki, Waltraud Ott and Dirk Thomsen.

Many thanks go to the following cooperating societies: DGBMT (Deutsche Gesellschaft für Biomedizinische Technik) of the VDE, CURAC (Deutsche Gesellschaft für Computer- und Roboterassistierte Chirurgie), FhG-AIS (Fraunhofer Society, Institute of Autonomous Intelligent Systems), Caesar (Center of Advanced European Studies and Research) and the European Academy Bad Neuenahr/Ahrweiler.

On behalf of all authors I would also like to thank Chelsea Chin (World Scientific Publishing Company, Singapore) for her help in the preparation of these proceedings.

Remagen, April 2004
Thorsten M. Buzug Workshop Chair MRNV 2004
CONTENTS

Foreword
v
Registration
1
Stereotactic Treatment Planning Using Fused Multi-Modality Imaging
K.-D. Hamm, G. Surber, G. Kleinert, M. Schmücking, A. Niesen, R. P. Baum, R. Aschenbach and S. Basche
3
Non-Rigid Registration of Intraoperatively Acquired 3D Ultrasound Data of Brain Tumors
M. Letteboer, P. Hellier, D. Rueckert, P. Willems, J. W. Berkelbach and W. Niessen
11
Comparison of Different Registration Methods for Navigation in Cranio-maxillofacial Surgery
M. Zinser, R. A. Mischkowski, M. Siessegger, J. Neugebauer, A. Kübler and J. E. Zöller
19
Localisation of Moving Targets for Navigated Radiotherapy
L. Vences, O. Sauer, M. Roth, K. Berlinger, M. Doetter and A. Schweikard
26
Accuracy and Practicability of Laser Surface Scanning for Registration in Image Guided Neurosurgery
R. Krishnan, A. Raabe and V. Seifert
31
Using the AWIGS System for Preparation of Computer Aided Surgery
H. Knoop, J. Raczkowsky, H. Wörn, U. Wyslucha and T. Fiegele
31
Ultra-Fast Holographic Recording and Automatic 3D Scan Matching of Living Human Faces
D. Giel, S. Frey, A. Thelen, J. Bongartz, P. Hering, A. Nüchter, H. Surmann, K. Lingemann and J. Hertzberg
43
Automatic Coarse Registration of 3D Surface Data in Oral and Maxillofacial Surgery
T. Maier, M. Benz, N. Schön, E. Nkenke, F. W. Neukam, F. Vogt and G. Häusler
51
Automated Marker Detection for Patient Registration in Image Guided Neurosurgery
R. Krishnan, E. Herrmann, R. Wolff, A. Raabe and V. Seifert
59
Advanced Navigation and Motion Tracking
67
Clinical Relevance of Preoperative CT-Based Computer Aided 3D-Planning in Hepatobiliary, Pancreatic Surgery and Living Donor Liver Transplantation
J. Harms, H. Bourquain, K. Oldhafer, H.-O. Peitgen, J. Hauss and J. Fangmann
69
Analysis of Drill Sound in Spine Surgery
I. Boesnach, M. Hahn, J. Moldenhauer, Th. Beth and U. Spetzger
77
Experimental Navigation Setup for Coronary Interventions
J. Borgert, S. Krüger, R. Grewer and H. Timinger
85
Beating Heart Tracking in Robotic Surgery Using 500 Hz Visual Servoing
R. Ginhoux, J. A. Gangloff, M. F. de Mathelin, L. Soler, M. M. Arenas Sanchez and J. Marescaux
93
Occlusion-Robust, Low-Latency Optical Tracking Using a Modular Scalable System Architecture
A. Köpfle, R. Männer, M. Schill, M. Rautmann, P. P. Pott, M. L. R. Schwarz, H.-P. Scharf, A. Wagner, E. Badreddin and P. Weiser
101
Development of Autoclavable Reflective Optical Markers for Navigation Based Surgery
D. Schauer, T. Krüger and T. Lueth
109
Iso-C 3D Navigated Drilling of Osteochondral Defects of the Talus: A Cadaver Study
M. Citak, J. Geerling, D. Kendoff, M. Richter, T. Hüfner and C. Krettek
118
Development of a Navigation System for Transcranial Magnetic Stimulation (TMS)
A. Wechsler, S. Woessner, J. Stallkamp, A. Thielscher and T. Kammer
122
Fluoroscopy Based Navigated Drilling of Four Osteonecrotic Lesions in One Patient
M. Citak, J. Geerling, D. Kendoff, H. Wübben, C. Krettek and T. Hüfner
129
Iso-C 3D Accuracy Control and Usefulness at Calcaneus Osteosynthesis
D. Kendoff, J. Geerling, M. Richter, T. Hüfner, M. Citak and C. Krettek
133
Craniofacial Endosseous Implant Positioning with Image-Guided Surgical Navigation
J. Hoffmann, D. Troitzsch, C. Westendorff, F. Dammann and S. Reinert
137
A Hybrid Approach to Minimally Invasive Craniomaxillofacial Surgery: Videoendoscopic-Assisted Interventions with Image-Guided Navigation
J. Hoffmann, D. Troitzsch, C. Westendorff, F. Dammann and S. Reinert
144
Geometrical Control Approaches for Minimally Invasive Surgery
M. Michelin, E. Dombre and P. Poignet
152
Prospective Head Motion Compensation by Updating the Gradients of the MRT
C. Dold, E. A. Firle, G. Sakas, M. Zaitsev, O. Speck, J. Hennig and B. Schwald
160
Calibration and Accuracy Analysis
169
Non-Invasive Intraoperative Imaging Using Laser Radar Imaging in Hip-Joint Replacement Surgery G. Kompa and G. Kamucha
171
Accuracy in Computer Assisted Implant Dentistry: Image Guided Template Production vs. Burr Tracking
G. Widmann, R. Widmann, E. Widmann and R. J. Bale
180
3D-Accuracy Analysis of Fluoroscopic Planning and Navigation of Bone-Drilling Procedures
J. A. K. Ohnsorge, M. Weisskopf, E. Schkommodau, J. E. Wildberger, A. Prescher and C. H. Siebert
187
Accuracy of Fluoroscope and Navigated Controlled Hind- and Midfoot Correction of Deformities: A Feasibility Study
J. Geerling, S. Zech, D. Kendoff, T. Hüfner, M. Richter and C. Krettek
193
Navigation Error of Update Navigation System Based on Intraoperative MRI
Y. Muragaki, H. Iseki, M. Sugiura, K. Suzukawa, K. Nambu, K. Takakura, T. Maruyama, O. Kubo and T. Hori
198
Accuracy Analysis of Vessel Segmentation for a LITT Dosimetry Planning System
J. Drexl, V. Knappe, H. K. Hahn, K. S. Lehmann, B. Frericks, H. Shin and H.-O. Peitgen
204
Clinical Case Studies
215
Resection of Bony Tumors Within the Pelvis: Hemipelvectomy Using Navigation and Implantation of a New, Custom Made Prosthesis
J. Geerling, D. Kendoff, L. Bastian, E. Mössinger, M. Richter, T. Hüfner and C. Krettek
217
f-MRI Integrated Neuronavigation: Lesion Proximity to Eloquent Cortex as Predictor for Motor Deficit
R. Krishnan, H. Yahya, A. Szelényi, E. Hattingen, A. Raabe and V. Seifert
223
Trends and Perspectives in Computer-Assisted Dental Implantology
K. Schicho, R. Ewers, A. Wagner, R. Seemann and G. Wittwer
231
The Experience of the Working Group for Computer Assisted Surgery at the University of Cologne
R. A. Mischkowski, M. Zinser, M. Siessegger, A. Kübler and J. E. Zöller
236
Fluoroscopic Navigation of the Dynamic Hip Screw (DHS): An Experimental Study
D. Kendoff, J. Geerling, M. Citak, T. Gösling, T. Hüfner, C. Krettek and M. Kfuri Jr
240
Image-Guided Navigation for Interstitial Laser Treatment of Vascular Malformations in the Head and Neck
J. Hoffmann, C. Westendorff, D. Troitzsch, U. Ernemann and S. Reinert
244
Minimally Invasive Navigation-Assisted Bone Tumor Excision in the Parietal Skull
J. Hoffmann, D. Troitzsch, C. Westendorff, F. Dammann and S. Reinert
252
Simulation and Modelling
259
Realistic Haptic Interaction for Computer Simulation of Dental Surgery
A. Petersik, B. Pflesser, U. Tiede, K. H. Höhne, M. Heiland and H. Handels
261
Robotic Simulation System for Precise Positioning in Medical Applications
E. Freund, F. Heinze and J. Rossmann
270
Computer-Aided Suturing in Laparoscopic Surgery
F. Nageotte, M. de Mathelin, C. Doignon, L. Soler, J. Leroy and J. Marescaux
278
Experimental Validation of a Force Prediction Algorithm for Robot Assisted Bone-Milling
C. Plaskos, A. J. Hodgson and P. Cinquin
286
SKALPEL-ICT: Simulation Kernel Applied to the Planning and Evaluation of Image-Guided Cryotherapy
A. Branzan Albu, D. Laurendeau, C. Moisan and D. Rancourt
295
Simulation of Radio-Frequency Ablation Using Composite Finite Element Methods
T. Preusser, F. Liehr, U. Weikard, M. Rumpf, S. Sauter and H.-O. Peitgen
303
An Interactive Planning and Simulation Tool for Maxillo-Facial Surgery
G. Berti, J. Fingberg, J. G. Schmidt and T. Hierl
311
Robotic Interventions
319
Principles of Navigation in Surgical Robotics
D. Henrich and P. Stolka
321
Robotic Surgery in Neurosurgical Field
H. Iseki, Y. Muragaki, R. Nakamura, M. Hayashi, T. Hori, K. Takakura, S. Omori, K. Nishizawa and I. Sakuma
330
From the Laboratory to the Operating Room: Usability Testing of LER, the Light Endoscope Robot
P. Berkelman, E. Boidard, P. Cinquin and J. Troccaz
338
Robotic and Laser Aided Navigation for Dental Implants
T. M. Buzug, U. Hartmann, D. Holz, G. Schmitz, J. Bongartz, P. Hering, M. Ivanenko, G. Wahl and T. Pohl
344
SURGICOBOT: Surgical Gesture Assistance COBOT for Maxillo-Facial Interventions
E. Bonneau, F. Taha, P. Gravez and S. Lamy
353
Sensor-Based Intra-Operative Navigation for the Robot-Assisted Keyhole Surgery
J. Stallkamp
361
Robotized Distraction Device for Soft Tissue Monitoring in Knee Replacement Surgery
C. Marmignon, A. Lemniei and P. Cinquin
369
State of the Art of Surgical Robotics
P. P. Pott, A. Köpfle, A. Wagner, E. Badreddin, R. Männer, P. Weiser, H.-P. Scharf and M. L. R. Schwarz
375
Optimisation of the Robot Placement in the Operating Room
P. Maillet, P. Poignet and E. Dombre
383
Safety of Surgical Robots in Clinical Trials
W. Korb, R. Boesecke, G. Eggers, B. Kotrikova, R. Marmulla, N. O'Sullivan, J. Mühling, S. Hassfeld, D. Engel, H. Knoop, J. Raczkowsky and H. Wörn
391
Robotics in Health Care: Interdisciplinary Technology Assessment Including Ethical Reflection
M. Decker
397
Sensor Feed-Back Systems
405
Palpation Imaging Using a Haptic System for Virtual Reality Applications in Medicine
W. Khaled, S. Reichling, O. T. Bruhns, H. Böse, M. Baumann, G. Monkman, S. Egersdörfer, D. Klein, A. Tunayar, H. Freimuth, A. Lorenz, A. Pesavento and H. Ermert
407
In Vivo Study of Forces During Needle Insertions
B. Maurin, L. Barbe, B. Bayle, P. Zanne, J. Gangloff, M. de Mathelin, A. Gangi, L. Soler and A. Forgione
415
Teaching Bone Drilling: 3D Graphical and Haptic Simulation of a Bone Drilling Operation
H. Esen, M. Buss and K. Yano
423
Micromachined Silicon 2-Axis Force Sensor for Teleoperated Surgery
F. Van Meer, D. Estève, A. Giraud and A. M. Gué
432
Improvement of Computer- and Robot-Assisted Surgery at the Lateral Skull Base by Sensory Feedback
D. Malthan, J. Stallkamp, S. Woessner, E. Schwaderer, F. Dammann and M. M. Maassen
441
A Surgical Mechatronic Assistance System with Haptic Interface
S. Pieck, I. Gross, P. Knappe and J. Wahrburg
450
Visualization and Augmented Reality
459
Calibration of a Stereo See-Through Head-Mounted Display
S. Ghanai, G. Eggers, J. Mühling, R. Marmulla, S. Hassfeld, T. Salb and R. Dillmann
461
Combined Tracking System for Augmented Reality Assisted Treatment Device
R. Wu, I. Kassim, W. S. Bock and N. W. Sing
468
Virtual Reality, Augmented Reality and Robotics in Surgical Procedures of the Liver
L. Soler, N. Ayache, S. Nicolau, X. Pennec, C. Forest, H. Delingette, D. Mutter and J. Marescaux
476
Functional Mapping of the Cortex: Surface Based Visualization of Functional MRI
R. Krishnan, A. Raabe, M. Zimmermann and V. Seifert
485
3D-Reconstruction and Visualization of Bone Mineral Density for the Ethmoid Bone
C. Kober, R. Sader and H.-P. Zeilhofer
490
List of Authors
499
Registration
STEREOTACTIC TREATMENT PLANNING USING FUSED MULTI-MODALITY IMAGING*

KLAUS-D. HAMM, GUNNAR SURBER, GABRIELE KLEINERT
Department for Stereotactic Neurosurgery and Radiosurgery, Helios Klinikum Erfurt, Germany

M. SCHMÜCKING, A. NIESEN, R. P. BAUM
Clinic for Nuclear Medicine / PET Centre, Zentralklinik Bad Berka, Germany

R. ASCHENBACH, ST. BASCHE
Institute for Diagnostic Imaging, Helios Klinikum Erfurt, Germany
Purpose: The exact target definition, even in case of very small target volumes, is an important prerequisite both for minimally invasive stereotactic surgery and for radiosurgery (RS) / stereotactic radiotherapy (SRT) on a high accuracy level. Besides, there are further contributions to the "overall process accuracy", like imaging modality, image fusion, 3D treatment planning, patient repositioning and the mechanical accuracy of the system used. Based on innovative software solutions, image fusion may offer the desired data superposition within a short time. Also metabolic images from PET data sets and, in case of arteriovenous malformations, stereotactic angiographic projections (DSA) might be integrated. Special conditions and advantages of BrainLAB's fully automatic image fusion are described.
Method: In the last 3.5 years 535 patients were treated with stereotactically guided biopsy/puncture or RS / SRT. For each treatment planning procedure a fully automatic image fusion of all image modalities of interest was performed. The planning CT data set (slice thickness 1.25 mm) and, for arteriovenous malformations, also a stereotactic DSA were acquired using head fixation with a stereotactic frame or, in case of stereotactic radiotherapy, a relocatable stereotactic mask. Different sequences of MRI (slice thickness 1-2 mm) and in 17 cases F-18-FDG or FET PET (3.4 mm) data sets were used without head fixation.
Results: The fully automatic image fusion of different MRI, CT and PET data sets could be realized for each patient. Only in few cases did it visually seem necessary to correct the fusion manually, with no better results. The precision of the automatic fusion result depended on the quality of the imaging modalities (especially slice thickness) and on patient movement during data acquisition. Fusing thin slices of an enlarged region of interest into another data set was also possible in good quality. Target volume outlining could be managed in a much better way using all information of the fused image data sets.
Conclusions: Additional information provided by PET, CT and different MRI scans enables us to improve target definition for stereotactic or neuronavigated treatment planning. The automatic image fusion used is a very accurate and reliable tool and allows a fast (about 1-2 min) and precise fusion of all available image data sets, depending on their acquisition quality.
* This work is supported by BrainLab AG, Heimstetten, Germany
1. Introduction

Stereotaxy is a well-known neurosurgical technique for precise target localization by a Cartesian coordinate system, an important prerequisite for stereotactic surgery and radiation treatment (Fig. 1). Even neuronavigation, where different fiducials are used instead of a stereotactic frame, is based on this tool.
Figure 1. Stereotactic treatment options: for RS and stereotactic surgery a stereotactic frame is needed, for SRT a frame-based stereotactic mask system.
In general, image fusion quality depends on the data sets used, which are determined by the performance of the imaging systems and the scan protocols. Powerful image fusion systems are capable of superimposing various CT and MRI data sets, in some cases PET, SPECT and fMRI data sets, too (5,6,7,9,13). The respective data sets must be acquired in accordance with the software requirements and transferred losslessly to the workstation via LAN or by means of a CD-ROM.

Diagnostic imaging plays an important role for diagnostics, treatment and follow-up of relevant neurosurgical, intra-cranial lesions. Today's MRI and CT scanners enable submillimetric slice widths with digital data transfer to image processing workstations. Also metabolic images from PET data sets and, in case of arteriovenous malformations, stereotactic angiography projections (DSA) might be integrated. Thereby the preconditions are met for minimally invasive, neuronavigated operations, virtual simulation of operations, and for stereotactic operations and radiation treatments. In these cases an accurate image fusion of all relevant data sets should be available.
There are different image fusion algorithms, working in a partially or fully automatic way:
- Object pairs (e.g. contoured ventricle, right + left bulbus oculi)
- Landmarks (e.g. vessel junctions)
- Grey level (mutual information algorithm)
- Marker localization (stereotaxy, marker modules)

The choice of the fusion software to be used depends on the accuracy level required for the respective application. For diagnostics and treatment of brain lesions an accuracy of ±1 mm is needed (Fig. 2).
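The grey-level approach in this list is usually driven by maximizing mutual information between the two data sets. As an illustration only (not the BrainLAB implementation, whose internals are not described in the paper), a minimal NumPy sketch of the mutual-information measure computed from a joint grey-level histogram might look like this:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Estimate mutual information between two equally sized images
    from their joint grey-level histogram (higher = better aligned)."""
    hist_2d, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = hist_2d / hist_2d.sum()   # joint probability distribution
    p_a = p_ab.sum(axis=1)           # marginal of image A
    p_b = p_ab.sum(axis=0)           # marginal of image B
    nonzero = p_ab > 0               # avoid log(0)
    outer = p_a[:, None] * p_b[None, :]
    return float(np.sum(p_ab[nonzero] * np.log(p_ab[nonzero] / outer[nonzero])))
```

A grey-level registration algorithm would evaluate this measure repeatedly while varying the rigid transform applied to one data set, keeping the transform that maximizes it; an image compared against itself yields a much higher value than two unrelated images.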
[Figure panels: 06/00 (without image fusion!) and 05/01 (11 mo. after RS) (with image fusion!)]
Figure 2. Follow-up of a brainstem metastasis without (upper fig.) and with (lower fig.) exact image fusion. Pat. K. I., fem., 49 ys., pons metastasis of a thyroid carcinoma, after total resection 3/99 and radioiodine therapy; 06/00 radiosurgery (RS).
2. Methods
Within the last 3.5 years 535 patients have been treated with stereotactically guided biopsy/puncture or RS / SRT. In our centre the image resolution has been optimized through a 1 mm MRI slice width with sufficient low-contrast resolution, and a CT slice width of 1.25 mm (both with no gap and a 512² matrix). This resulted in a voxel geometry of 1.25 x 0.75 x 0.75 mm that met the submillimetric accuracy requirements for stereotaxy. Thereby the pre-conditions for a precise image
fusion could be considered as accomplished. The data sets were transferred to the planning workstation via LAN. The planning CT (for arteriovenous malformations also a stereotactic DSA) was acquired using head fixation with a stereotactic frame. In case of brain tumors foreseen for stereotactic radiotherapy, a special relocatable stereotactic mask was used for the acquisition of the stereotactic planning CT data set. The different MRI sequences and, in 17 cases of gliomas, also an F-18-FDG or FET PET (slice thickness: 3.4 mm) were made and fused without head fixation (Fig. 2). The externally acquired PET data sets could be transferred to the planning workstation via CD or anonymous e-mail. For each treatment planning procedure a fully automatic image fusion of all image modalities of interest was performed, visually controlled and, if necessary, corrected.
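The voxel geometry quoted above follows directly from the scan parameters, and the arithmetic can be checked in a few lines. Note that the 384 mm field of view below is an assumed value, chosen only because it reproduces the stated 0.75 mm in-plane spacing; the paper does not give the FOV explicitly:

```python
# Planning CT geometry as stated in the paper: 1.25 mm slices, 512 x 512 matrix.
slice_thickness_mm = 1.25
matrix_size = 512
fov_mm = 384.0  # assumption: a field of view that reproduces the quoted spacing

in_plane_mm = fov_mm / matrix_size            # in-plane pixel spacing
voxel_mm = (slice_thickness_mm, in_plane_mm, in_plane_mm)
print(voxel_mm)  # (1.25, 0.75, 0.75), the geometry quoted in the text
```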
Figure 2. Image fusion for stereotactic radiotherapy planning (28 x 2 Gy). Pat. H. W., fem. (34), astrocytoma II, recurrence right temporoparietal, three times resected, slight left-sided hemiparesis. Axial slice with fused F-18-FET PET, 3.4 mm slices (A), MRI T1 contrast, 1.0 mm slices (B), MRI FLAIR, 2 mm slices (C) and contrast CT, 1.25 mm slices (D), used for precise definition of active tumor = target volume (thicker orange and brown line); isodose plan (E), 90% isodose covered tumor plus 2-3 mm margin.
3. Results
The automatic image fusion lasted approx. 1-2 minutes and could be corrected manually upon visual inspection. According to our experience of more than 1000 fusions, the accuracy of the automatic image fusion did not need any correction in most of the cases. Only in few cases did it seem necessary to correct the fusion manually after visual evaluation, which in general did not improve fusion quality. Millimetric slices without movement artefacts enabled an accurate automatic image fusion. Even the automatic fusion of partial volumes covering the region of interest (e.g. 20-30 slices with SW = 1 mm) worked sufficiently. In some of these cases it became necessary to roughly correlate the considered volumes manually before starting the automatic fusion. If data sets with slice widths between 3 and 5 mm are fused (manually or automatically), the precision level is reduced. Grosu et al. (5) published an accuracy of 2.4 mm (stand. dev. 0.5 mm) for BrainLAB's automatic image fusion using 2 mm MRI slices and 2.4 mm PET slices. The fully automatic image fusion of different MRI, CT and PET series could be realized for each patient. The precision level of the automatic fusion result depended most of all on the image quality of the modalities used, especially the selected slice thickness and, in case of MRI, the field homogeneity, but also on the degree of patient movement during data acquisition. In general, target volume outlining could be done more exactly by using all image information of the fused data sets.
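Accuracy figures such as the 2.4 mm (standard deviation 0.5 mm) cited from Grosu et al. are typically derived from residual distances between corresponding landmarks after fusion. A minimal sketch of that computation, using hypothetical landmark coordinates (not data from this study):

```python
import numpy as np

# Hypothetical landmark positions in mm: the same anatomical points
# localized once in the reference data set and once after fusion.
reference = np.array([[10.0, 20.0, 30.0],
                      [40.0, 50.0, 60.0],
                      [70.0, 80.0, 90.0]])
fused = reference + np.array([[1.0, 0.0, 0.0],   # residual misalignments
                              [0.0, 2.0, 0.0],
                              [0.0, 0.0, 1.5]])

# Per-landmark Euclidean distance, then mean and standard deviation.
errors = np.linalg.norm(fused - reference, axis=1)
print(errors.mean(), errors.std())
```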
4. Discussion
For stereotactic treatment planning the use of image information of all relevant modalities is very important. An accurate image fusion supports this intention (2,5,6,8,9,11). For all patients a full MRI data set (high resolution, 172 slices, slice width 1 mm) is fused onto a stereotactic CT data set (high resolution, 128 slices, slice width 1.25 mm), enabling an accurate target point determination or target volume definition (10,15,16). This results in a voxel geometry of 1.25 x 0.75 x 0.75 mm that meets the submillimetric accuracy requirements for stereotaxy. In this way the invasive access for a stereotactic operation, bypassing the organs at risk, can be planned, simulated and calculated slice by slice on the planning workstation. Neuronavigation offers a safe orientation in cases with the lesion in the precentral brain. Especially for skull base tumours, the image information from MRI and CT adds up to a micro-anatomical/-pathological general view. This eases access
planning for stereotactic biopsies and enables the contouring of the target volume and radiation-sensitive organs at risk for stereotactic radiation treatment planning. PET as a metabolic imaging facility can contribute to the differentiation of scarred changes after operations, radiation effects after radiation treatments, surrounding edema and active tumour growth for gliomas (especially WHO grade II) and should therefore be involved in image fusion (5,6,11,13). For radiosurgery planning of arteriovenous malformations (AVM), angiographic projections in A-P and lateral direction with attached head frame and localizer box are needed (Fig. 3).
Figure 3: Integration of the stereotactically localized DSA (digital subtraction angiography) into a RS treatment plan of an AVM (arteriovenous malformation, yellow color). A - frontal and B - right lateral projection of the stereotactic DSA; C - stereotactic planning CT, axial slice 1.25 mm; D - automatically fused MRI, axial slice 1 mm, with isodose plan. 11-year-old girl after acute intracerebral hemorrhage 04/02 with hemiparesis on the left side, incomplete recovery; 08/02 RS with 20 Gy to the isocenter.
A precise image fusion also offers important advantages for treatment quality control. It allows precise follow-up studies after RS / SRT (see Fig. 2 above), particularly in the case of very small lesions and for accurate volumetry (3,17). Routine CT controls after stereotactic serial biopsies should also be fused to the planning CT. Besides the necessary exclusion of a postpunctural bleeding, the congruence of the puncture channel with the planned target point trajectory can be proven (Fig. 4).
Figure 4: Image fusion for stereotactic biopsy quality control. Pat. E.O., male, 61 y. - unknown brainstem tumor with hemiparesis on the right side - stereotactic biopsy - histological examination revealed a primary cerebral lymphoma. A - planning CT with contrast, axial slices 1.25 mm; the lines are the planned virtual trajectories with the target point inside the enhancing tumor. B - fused postoperative CT without contrast, axial slices 5 mm, for quality control; note the small air bubbles next to the target point.
5. Conclusions
Image fusion has a high significance for diagnostic imaging of cerebral lesions. There are various software solutions available. For diagnostics and therapy of brain lesions an accuracy level of ± 1 mm is needed. As a precondition, the available CT and MRI scanners must offer thin-slice acquisition of DICOM images with high-grade image quality. The automatic image fusion software from BrainLAB works reliably and delivers exactly comparable image superpositions. Various CT, MRI and PET data sets may be fused without any problem. Thereby the automatic image fusion is a suitable tool fulfilling neurosurgical needs in diagnostics and therapy of intracranial lesions and related quality assurance aspects.
Using all information provided by PET, CT and different MRI scans enables us to improve target definition for stereotactic or neuronavigated treatment planning. The image fusion also offers a very precise treatment quality control.

References
1. O. Betti, D. Galmarini and V. Derechinskiy, Stereotact Funct Neurosurg 57: 87 (1991)
2. S. Burkhardt, A. Schweikard and R. Burkart, Medical Image Analysis 7: 221 (2003)
3. A. D'Ambrosio and J. Bruce, Curr Neurol Neurosci Rep 3: 206 (2003)
4. G. Grebe, M. Pfaender and M. Roll, Int J Radiat Oncol Biol Phys 51: 1451 (2001)
5. A. Grosu, R. Lachner and N. Wiedenmann, Int J Radiat Oncol Biol Phys 56: 1450 (2003)
6. K. Hamm, Ch. Przetak, G. Kleinert, G. Surber, M. Schmücking, R.P. Baum, In: Jäckel A (ed) Telemedizinführer Deutschland, Ober-Mörlen, Ausgabe 2002, pp 162
7. M. Henze, A. Mohammed and H. Schlemmer, Eur J Nucl Med Mol Imaging 29: 1455 (2002)
8. D.L. Hill, P.G. Batchelor, M. Holden and D.J. Hawkes, Phys Med Biol 46: R1 (2001)
9. H.A. Jaradat, W.A. Tome, T.R. McNutt and M.E. Meyerand, Technology in Cancer Research and Treatment 2: 1 (2003)
10. T. Kapur, E. Grimson, W. Wells and R. Kikinis, Medical Image Analysis 1: 109 (1996)
11. S. Mutic, J.F. Dempsey, W.R. Bosch, D.A. Low, R.E. Drzymala, K.S.C. Chao, S.M. Goddu, P.D. Cutler and J.A. Purdy, Int J Radiat Oncol Biol Phys 51: 255 (2001)
12. D. Ross, H. Sandler and J. Balter, J Neurooncol 56: 175 (2002)
13. M. Schmücking, R. Baum, C. Przetak, A. Niesen, G. Surber, G. Kleinert, K. Hamm, E. Lopatta and T. Wendt, Strahlenther Onkol 177 (Sondernr 1): 6 (2001)
14. T. Solberg, R. Fogg and M. Selch, Radiosurgery 3: 53 (2000)
15. C. Studholme, D. Hill and D. Hawkes, Medical Image Analysis 1: 163 (1996)
16. W. Wells, P. Viola, H. Atsumi, S. Nakajima and R. Kikinis, Medical Image Analysis 1: 35 (1996)
17. J. Williams, Stereotact Funct Neurosurg 78: 17 (2002)
NON-RIGID REGISTRATION OF INTRAOPERATIVELY ACQUIRED 3D ULTRASOUND DATA OF BRAIN TUMORS
’,
’,
MARLOES M.J. LETTEBOER I * , PIERRE HELLIER DANIEL RUECKERT PETER W.A. WILLEMS ‘, JAN WILLEM BERKELBACH WIRO J. NIESSEN Image Sciences Institute, University Medical Center, Utrecht, the Netherlands Projet Vista, IRISA/INRIA-CNRS,Rennes, France Department of Computing, Imperial College, London, UK Department of Neurosurgety, University Medical Center, Utrecht, the Netherlands
’
‘,
’
’
During image-guided neurosurgical interventions brain tissue can shift or deform, leading to an inaccurate determination of the tumor location during surgery. To correct for the deformations of the tumor and surrounding tissue we use a series of intraoperatively acquired 3D ultrasound volumes. The deformation between the subsequent ultrasound volumes is determined by registering these volumes with a non-rigid registration method. In this article the performances of two non-rigid registration algorithms, a B-spline based free-form deformation method and an optical flow based deformation method, are compared. In the four patients measured, the overlap of tumor tissue between subsequent ultrasound volumes was on average 75% after registration based on the image-guided surgery system. After rigid registration the overlap increased to, on average, 85%. After non-rigid registration with the B-spline based free-form deformation algorithm the overlap increased to 93%, while after non-rigid registration with the optical flow based deformation method the overlap increased to 91%.
1. Introduction
In neurosurgery, determining the position of a tumor during interventions by navigation based on preoperatively acquired (MRI) data is a common approach. In current systems that allow for navigation on preoperative data, it is assumed that the patient's head and its content behave as a rigid body. Extrinsic features or markers are used to determine the rigid body transformation between the patient and the preoperative data. However, due to a variety of factors, including cerebrospinal fluid drainage, use of diuretics, gravity and tumor resection, brain tissue will shift during surgery with respect to the skull. This induces an error, which affects the overall accuracy of image-guided neurosurgery systems. As a consequence, during tumor surgery the tumor location and shape with respect to the preoperative MR is uncertain. Intraoperative ultrasound has been proposed as a possible solution to this problem. Trobaugh [1] introduced the concept of correlating intraoperative
* Corresponding author: Marloes Letteboer, Image Sciences Institute, University Medical Center, Heidelberglaan 100, Room E01.335, 3485 CX Utrecht, the Netherlands. E-mail: [email protected]
ultrasound with preoperative CT or MRI during neuronavigation, by monitoring the position and orientation of the ultrasound transducer with a tracking device. Brain deformation can now be estimated by comparing the preoperative MR to the intraoperative ultrasound data. However, non-rigid MR-ultrasound registration is a difficult problem. In this paper we take a different approach. It is known that a large part of brain shift (before surgery) is caused by opening the dura, so we acquired intraoperative ultrasound images before and after opening the dura and estimated the deformation by non-rigid registration of these datasets. A quantitative investigation of brain deformation using co-registered ultrasound scans on phantoms [2,3] and post-mortem pig brain [3-5] has previously been described in the literature. In this article we compare the performance of two non-rigid registration algorithms, a B-spline based free-form deformation algorithm [6] and an optical flow deformation algorithm [7], for the registration of ultrasound volumes acquired prior to and after opening the dura in image-guided neurosurgery for four patients.
2. Materials and Methods
2.1. Data Acquisition
To plan the neurosurgical procedure, images from a 1.5 Tesla MRI scanner were acquired preoperatively. All acquired images were 3D gadolinium-enhanced, T1-weighted acquisitions. The matrix size was 256 by 256, with 120 to 150 slices. The voxel size was 1.0 x 1.0 x 1.1 mm in all cases. During the image-guided neurosurgical procedure free-hand sweeps were made while the ultrasound probe (Aloka SSD-5000 with 7.5 MHz neuroprobe, Tokyo, Japan) was tracked using a Polaris camera, which is part of the neuronavigation workstation. The relative positions of the 2D scans were used to reconstruct a 3D volume, using the software package StradX [8]. For four patients at least two ultrasound volumes were acquired; one prior to opening the dura and one after opening the dura, but prior to tumor removal. For all patients around 100 B-scans were acquired per dataset, from which a 3D volume was reconstructed. The scans were acquired with a probe penetration depth of six to twelve centimeters. The reconstructed volumes had a matrix size of around 150 x 150 x 100 voxels with a voxel size of 1.0 x 1.0 x 1.0 mm.

2.2. Registration of Ultrasound Volumes

The goal of the registration process is to find the optimal transformation T: (x,y,z) → (x',y',z'), which maps all points of the ultrasound image I(x,y,z,t), acquired at time t in the course of surgery, to the ultrasound image I(x',y',z',t0)
taken prior to opening the dura. In general, the deformation of the brain between these two acquisitions is non-rigid, which means rigid or affine transformations alone are not sufficient to correct for this motion. Therefore we test two different non-rigid registration algorithms for the registration of the 3D ultrasound volumes. In both approaches, the images are first rigidly aligned, using a registration approach based on mutual information as a similarity measure, with six degrees of freedom (translation and rotation). Subsequently, the two non-rigid registration approaches, B-spline based free-form deformation and optical flow based deformation, are applied.
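The rigid alignment step uses mutual information as the similarity measure. The paper does not give its implementation; as an illustration only, a minimal plug-in estimator computed from the joint intensity histogram of two volumes might look like the following (the function name and bin count are our own choices, not from the paper):

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally shaped image volumes,
    estimated from their joint intensity histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()                 # joint probability
    px = pxy.sum(axis=1, keepdims=True)     # marginal of a
    py = pxy.sum(axis=0, keepdims=True)     # marginal of b
    nz = pxy > 0                            # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```

In a full rigid registration, an optimizer would vary the six translation and rotation parameters, resample the moving volume, and maximize this value.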
B-spline based free-form deformation

The free-form deformation algorithm [6] consists of a combined global and local motion model at each point (x,y,z), and can be expressed as:

T(x,y,z) = T_global(x,y,z) + T_local(x,y,z)    (1)
in which the global motion model describes the overall motion of the brain using an affine transformation. The local transformation is based on a free-form deformation (FFD) based on B-splines. The basic idea of FFDs is to deform an object by manipulating an underlying mesh of control points, where a large spacing of control points allows modeling of global non-rigid deformations, while a small spacing of control points allows modeling of highly local non-rigid deformations. To find the optimal transformation, a cost function associated with the global transformation parameters Θ, as well as the local transformation parameters Φ, is introduced. The cost function comprises two competing goals: the first term represents the cost associated with the image similarity, while the second term corresponds to the cost associated with the smoothness of the transformation:
C(Θ, Φ) = -C_similarity(I(t0), T(I(t))) + λ C_smooth(T)    (2)

where λ is the weighting factor, which defines the trade-off between the alignment of the two image volumes and the smoothness of the transformation. For computational efficiency, the optimization proceeds in several stages. During the first stage the affine transformation parameters are optimized, using an iterative multiresolution search strategy. During the subsequent stages the non-rigid transformation parameters are optimized. In each stage, a simple iterative gradient descent technique is used, which steps in the direction of the gradient vector with a certain step size. The algorithm stops when a local optimum of the cost function has been found.
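To make the local model concrete: the FFD displacement at a point is a weighted sum of the surrounding 4x4x4 control-point displacements, with cubic B-spline weights. The following sketch is our own simplified illustration; the grid indexing convention is an assumption, not the authors' code:

```python
import numpy as np

def bspline_basis(u):
    """Cubic B-spline basis functions B0..B3 evaluated at u in [0,1)."""
    return np.array([
        (1 - u) ** 3 / 6.0,
        (3 * u**3 - 6 * u**2 + 4) / 6.0,
        (-3 * u**3 + 3 * u**2 + 3 * u + 1) / 6.0,
        u**3 / 6.0,
    ])

def ffd_displacement(p, phi, spacing):
    """Local FFD displacement at point p = (x, y, z), given a grid phi of
    control-point displacement vectors, shape (nx, ny, nz, 3), with
    uniform spacing (assumes p falls well inside the grid)."""
    idx = np.floor(np.asarray(p) / spacing).astype(int)  # cell index
    u = np.asarray(p) / spacing - idx                    # local coordinates
    bx, by, bz = (bspline_basis(c) for c in u)
    disp = np.zeros(3)
    for l in range(4):
        for m in range(4):
            for n in range(4):
                disp += (bx[l] * by[m] * bz[n]
                         * phi[idx[0] + l, idx[1] + m, idx[2] + n])
    return disp
```

Because the basis functions sum to one, a uniform control grid translates every point by the same vector, and a zero grid leaves points unchanged.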
Optical flow based deformation

The optical flow constraint equation [7] assumes that the luminance of a physical point does not vary much between the two volumes to register:
∇f(s,t) · w_s + f_t(s,t) = 0    (3)

where s are the voxels of the volume, t is the index of the volume, f is the luminance function, w_s is the expected 3D deformation field, ∇f(s,t) is the spatial gradient of the luminance and f_t(s,t) is the voxelwise difference between the two volumes. Since this is an underdetermined system, typically a regularization term is added. This regularization term is defined according to the quadratic difference of the deformation field computed between neighbors. Using an energy-based framework the regularization problem may be formulated as the minimization of the following cost function:

U(w) = Σ_{s∈S} [∇f(s,t) · w_s + f_t(s,t)]² + α Σ_{(s,r)∈C} ||w_s − w_r||²    (4)
where S is the voxel lattice, C is the set of neighborhood pairs with respect to a given neighborhood system V on S ((s,r) ∈ C ⇔ s ∈ V(r)) and α controls the balance between the two energy terms. The first term is the linear expansion of the luminance conservation equation and represents the interaction between the field and the data. The second term is the smoothness constraint. In order to cope with large displacements, a classical incremental multiresolution procedure has been developed. Furthermore, at each resolution level, a multigrid minimization based on successive partitions of the initial volume is performed. A grid level is associated with a partition of cubes. At a given grid level, a piecewise affine incremental field is estimated. The resulting field is a rough estimate of the desired solution, and it is used to initialize the next grid level. This hierarchical minimization strategy improves the quality and the rate of convergence.
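The algorithm of [7] adds robust estimators and the multigrid/multiresolution machinery described above. Stripped of those, a single-level minimizer of the quadratic cost can be written with Horn-Schunck-style pointwise updates; the following is our own simplified sketch with hypothetical parameter values, not the authors' implementation:

```python
import numpy as np

def horn_schunck_3d(f0, f1, alpha=100.0, iters=50):
    """One resolution level of quadratically regularized 3D optical flow
    (Horn-Schunck style), estimating a dense field w for f0 -> f1."""
    g = np.stack(np.gradient(f0))          # spatial gradient, (3, X, Y, Z)
    ft = f1 - f0                           # temporal (inter-volume) difference
    w = np.zeros_like(g)                   # flow field, (3, X, Y, Z)
    for _ in range(iters):
        # average of the current field over the six spatial neighbours
        wbar = sum(np.roll(w, s, axis=a) for a in (1, 2, 3) for s in (-1, 1)) / 6.0
        # closed-form pointwise update of the regularized data term
        num = (g * wbar).sum(axis=0) + ft
        den = alpha + (g ** 2).sum(axis=0)
        w = wbar - g * (num / den)
    return w
```

Each iteration smooths the field and then applies the closed-form correction of the data term, which is the fixed-point iteration of the quadratic problem.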
2.3. Validation of Non-Rigid Registration

Since we are working with patient data there is no ground truth available to validate the non-rigid registration methods. With an initial visual inspection of difference images a qualitative assessment of the performance of the registration algorithms can be made, but it is difficult to compare the different algorithms. A quantitative assessment of the registration performance, at the position of the tumor, can be made by segmenting the tumor tissue in both the reference and the registered image and calculating the overlap. The overlap is defined as:

Overlap = 2 |V_r ∩ V_t| / (|V_r| + |V_t|)    (5)

where V_r and V_t are the segmented tumor volumes in the reference and the registered image.
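Given binary tumor segmentations of the reference and registered volumes, such an overlap can be computed directly. The sketch below assumes a Dice-style definition (twice the intersection over the sum of the volumes, expressed in percent), since the printed formula is not fully legible:

```python
import numpy as np

def tumor_overlap(seg_ref, seg_reg):
    """Volume overlap (Dice-style similarity) between two binary tumor
    segmentations, returned as a percentage (0-100)."""
    a = np.asarray(seg_ref, bool)
    b = np.asarray(seg_reg, bool)
    return 200.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())
```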
3. Results and Discussion
For four patients, diagnosed with different types of brain tumors, we acquired at least two ultrasound datasets during surgery. At least one scan was made just after craniotomy, before opening the dura, and at least one scan was made after opening the dura, before tumor removal. This was done by sweeping the ultrasound probe over the brain, while tracking the position of the probe, which made it feasible to reconstruct a volumetric dataset. The typical quality of these images is shown in Figure 1. Since the ultrasound probe was tracked during scanning it is possible to relate the two ultrasound volumes to the MR coordinate system, the patient coordinate system and each other. When comparing the ultrasound volumes acquired after opening the dura to the volumes acquired before opening the dura it can be seen, in all four patients, that the tumor tissue has shifted or deformed.

3.1. Rigid Registration
Since the shifts between ultrasound volumes acquired before and after opening the dura can be quite large (in the order of 1 cm), first a mutual information based rigid registration is performed with six degrees of freedom. After rigidly aligning the tumor tissue there is still some discrepancy between the two volumes, which is not due to a shift but due to deformation. These deformations can only be eliminated by using non-rigid registration methods.
3.2. Non-Rigid Registration Parameters

B-spline based free-form deformation

The free-form deformation algorithm consists of a combined global and local motion
Figure 1: A 2D plane of the 3D ultrasound volume, acquired prior to opening the dura, is shown for two patients (right column). The corresponding MR plane is also displayed (left). These two examples show the typical quality of the ultrasound volumes that were acquired during surgery.
model. For the global motion model the rigid registration described in section 3.1 is used. The local motion model is based on free-form deformations, based on B-splines (as described in section 2.2). In the current implementation of the algorithm the most important parameters to be varied are the similarity measure, the number of control points and the weighting factor (Equation 2).

-Similarity measure- Several similarity measures were tested, including mutual information, correlation coefficient and sum of squared differences. We did not notice significant differences between the performances of these similarity measures and chose mutual information for all the experiments.

-Weighting factor- The weighting factor λ weights between two contradicting requirements of the registration: the exact alignment of the two images and the smoothness of the transformation. However, since the B-splines used to interpolate between control points have an intrinsic smoothness, the choice of λ is not critical for sufficiently low resolutions of the control point mesh.

-Control point spacing- The density of the control point mesh determines the deformation that can be captured by the method. A larger control point spacing allows the user to find large deformations, while a smaller control point spacing allows for smaller deformations. We chose a hierarchical set-up for our experiments. The images are registered with a control point spacing of subsequently 16, 8 and 4
mm.

Optical flow based deformation

In the optical flow registration algorithm the most important parameters that have to be set are the parameters for the hierarchical minimization, the coarsest resolution level and the coarsest grid level, and the parameter α, which is the balancing coefficient between the data and regularization terms. We tested the dependence of the registration on these parameters. For all parameters there was a broad range of values for which the registration worked equally well. We chose our parameters somewhere in this range (coarsest resolution level = 3, coarsest grid level = 4, α = 750).

3.3. Validation of Non-Rigid Registration

Figure 2 shows a representative example of a visual inspection of difference images. The first conclusion from the visual inspection is that non-rigid registration performs better than rigid or IGS registration (the registration performed on the basis of the image-guided surgery system), but it is very difficult to compare the non-rigid registration results.
Figure 2: Visual inspection of the quality of the different registration methods. The top row shows (from left to right) a plane of the ultrasound volume acquired prior to opening the dura, the same plane of the volume acquired after opening the dura, and the difference images after registration with the image-guided surgery system and after rigid registration. The bottom row shows the difference images after registration with the free-form deformation method with a control point spacing of 16 mm, 8 mm and 4 mm, and after registration with the optical flow deformation method.
For quantitative validation we measure the tumor overlap after registration. To this end the segmentations are transformed according to the calculated deformation field, so there will be no bias toward one method because of different segmentation accuracy. The results for this overlap measure are displayed in Figure 3. The overlap between tumor segments for registration with the image-guided surgery system is quite low, on average 75%. This overlap improves to 85% after rigid registration. Non-rigid registration further improves this overlap to 90, 92, and 93% for the free-form deformation method with a control point spacing of, respectively, 16, 8 and 4 mm, and to an overlap of 91% for the optical flow deformation method. An important note is that the optical flow method is, in our current implementation, more than 100 times faster than the free-form deformation method at the highest resolution level. However, both methods are currently insufficiently fast for intraoperative use; therefore speed-up techniques are required.
4. Conclusion
In this article we compared two methods for non-rigid registration of 3D ultrasound volumes acquired during neurosurgical interventions. We conclude that both methods significantly improve the alignment of the ultrasound volumes at the position of the tumor. However, for both methods speed up techniques are required to make them sufficiently fast for intraoperative use.
Figure 3: Validation of the registration algorithms by measuring the overlap of tumor segments (in %) after registration, for patients 1-4.
References
1. J.W. Trobaugh, W.D. Richards, K.R. Smith, R.D. Bucholz, "Frameless Stereotactic Ultrasonography: Methods and Applications," Computerized Medical Imaging and Graphics, Vol. 18, pp. 235-246, 1994.
2. R.M. Comeau, A.F. Sadikot, A. Fenster, T.M. Peters, "Intraoperative Ultrasound for Guidance and Tissue Shift Correction in Image-Guided Neurosurgery," Medical Physics, Vol. 27, No. 4, pp. 787-800, 2000.
3. X. Pennec, P. Cachier, N. Ayache, "Tracking Brain Deformations in Time Sequences of 3D US Images," Pattern Recognition Letters, Vol. 24, pp. 801-813, 2003.
4. K.E. Lunn, K.D. Paulsen, D.W. Roberts et al., "Displacement Estimation with Co-Registered Ultrasound for Image Guided Neurosurgery: A Quantitative In Vivo Porcine Study," IEEE Transactions on Medical Imaging, Vol. 22, No. 11, pp. 1358-1368, 2003.
5. I. Pratikakis, C. Barillot, P. Hellier, E. Memin, "Robust Multiscale Deformable Registration of 3D Ultrasound Images," International Journal of Image and Graphics, Vol. 3, No. 4, pp. 547-565, 2003.
6. D. Rueckert, L.I. Sonoda, C. Hayes et al., "Nonrigid Registration Using Free-Form Deformations: Application to Breast MR Images," IEEE Transactions on Medical Imaging, Vol. 18, No. 8, pp. 712-721, 1999.
7. P. Hellier, C. Barillot, E. Memin, P. Perez, "Hierarchical Estimation of a Dense Deformation Field for 3D Robust Registration," IEEE Transactions on Medical Imaging, Vol. 20, No. 5, pp. 388-402, 2001.
8. R.W. Prager, A. Gee, L. Berman, "StradX: Real-Time Acquisition and Visualization of Freehand Three-Dimensional Ultrasound," Medical Image Analysis, Vol. 3, No. 2, pp. 129-140, 1999.
COMPARISON OF DIFFERENT REGISTRATION METHODS FOR NAVIGATION IN CRANIO-MAXILLOFACIAL SURGERY

ZINSER M., MISCHKOWSKI R.A., SIESSEGGER M., NEUGEBAUER J., KUBLER A., ZOLLER J.E.
Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Germany
1. Introduction
In recent years, navigation systems have become more and more common in cranio-maxillofacial surgery. These systems offer features such as three-dimensional visualisation of the skull on a screen and virtual planning of instrumentation that can be used for education and training [3,4,6,7,8,11,12,15,16]. The need for navigated procedures stems from the demand for higher accuracy. An important factor for this accuracy is the method of registration between the image data and the tracking of the patient and the tool. The correlation between the surgical site and the corresponding image data set in the operating room is the most time-consuming non-operative process for the surgeon [7]. Recent innovations in laser scanning technology provide a potentially useful tool for three-dimensional surface registration in image-guided surgery [2]. BrainLAB, a company producing navigation systems, has been offering a commercial handheld scanner called z-touch which utilizes surface matching of a preoperative scan and computed tomography images. The z-touch registers about 200 surface points on the patient's face. The precision of the z-touch is reported to range between 1 and 10 mm, which seems to be insufficient for cranio-maxillofacial indications. The reason for the large deviation in accuracy of the z-touch has not yet been discussed. Moreover, it is not known whether a shift of the patient's skin surface due to different tension in the muscles of expression during computed tomography and during preoperative surface scanning may lead to an invalid data set correlation for computer-assisted navigation. No phantom or cadaveric study will show the soft tissue shift and elicit whether these shifts cause a significant alteration of the patient's facial geometry and reduce the accuracy of data set correlations. Therefore it was decided to investigate the difference in accuracy in a clinical compared to an experimental setting.
Furthermore, the purpose of this study is to evaluate the clinical reliability of three different registration methods. The laser technique using the z-touch from the BrainLAB system was compared with two conventional (paired-point) registration tools, a head-set and skin markers, in clinical cranio-maxillofacial procedures as well as in an experimental skull model using image-guided navigation.
2. Methods
Surface registration and infrared-based navigation have been performed with the z-touch BrainLAB system. The z-touch performs an automatic data set correlation based on the patient's surface pattern, using the periorbital and nasal area. The patient's surface pattern is generated from the patient's native computed tomography (CT) data set. Data set correlation is done with just one click on the command button in the BrainLAB system (see Figure 1). In contrast to this new laser technique, the conventional registration methods, for example using a head-set or skin markers, require a preoperative CT scan with prefabricated and locally determined markers in order to correlate the CT data intraoperatively.
Figure 1: Target areas of laser surface scanning; intraoperative scanning [z-touch, BrainLAB]
In an experimental setting, a stable anthropomorphic skull model with prelabelled markers was scanned and registered with laser surface scanning (z-touch, BrainLAB) as well as with external marker-based algorithms (skin markers and head-set). The registration protocol was then repeated 60 times (see Figure 2). Registration errors as well as accuracy were then calculated.
Figure 2: Experimental setting, skull model, head-set and landmarks
In a clinical setting, a total of seventy-two patients with different indications for oral and cranio-maxillofacial surgery were planned for image-guided surgery using the same passive infrared surgical navigation system (VectorVision, BrainLAB) and marker-based algorithms (skin markers or head-set). The best measure to assess the quality of registration and the true application accuracy is the target localizing error, where the target represents the surgical field [13,17]. In detail, application accuracy was assessed by placing the tip of the pointer on the landmark shown in the CT image and comparing the position of the tip with the position in reality.
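As an illustration (function and variable names are our own, not the system's), the target localizing error per landmark and the deviation bands used in the tables can be computed as:

```python
import numpy as np

def accuracy_bands(image_pts, physical_pts):
    """Target localizing error per landmark (Euclidean distance in mm
    between the position indicated in the CT image and the tracked
    pointer-tip position), plus the fractions below 1 mm and 2 mm."""
    d = np.linalg.norm(np.asarray(image_pts, float)
                       - np.asarray(physical_pts, float), axis=1)
    return d, float((d < 1.0).mean()), float((d < 2.0).mean())
```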
3. Results
In the experimental protocol, registration with the head-set showed the most reliable results, with a deviation of less than 1 mm in 74% of registrations, versus 42% for the skin markers and 40% for laser scanning (z-touch). Within a deviation of 2 mm, an accuracy of 94% with the head-set, 92% with the skin markers and 86% with the z-touch scanning could be achieved (see Table 2). During various clinical procedures involving oral and cranio-maxillofacial surgery, the best results were obtained when registrations were performed with the head-set. The head-set showed a deviation of less than 2 mm in 94% of cases, versus 80% for skin markers and 68% for the laser scanner (z-touch) (see Table 1).
Table 1: Accuracy in the clinical setting (24 patients per group)

Method      | Patients | Deviation < 2 mm
Skin marker | 24       | 80%
Head-set    | 24       | 94%
z-touch     | 24       | 68%
Table 2: Accuracy in the experimental setting (registration repeated 60x)

Method   | Registrations | < 1 mm | 1-2 mm | < 2 mm
Head-set | 60x           | 74%    | 20%    | 94%
z-touch  | 60x           | 40%    | 46%    | 86%
Furthermore, registration with the head-set was technically much easier and faster. In other words, we seldom saw a breakdown of the computer, and far fewer software failures occurred in completing the registration with the BrainLAB software, although the preoperative planning required more time when using the head-set device. We see a tremendous saving of intraoperative time when using the head-set compared to laser scanning with the z-touch in the registration process with the BrainLAB system.
4. Discussion
The most commonly used method of registration is the paired-point method with artificial fiducials. A series of corresponding fiducial-based points is identified in both image space and physical space, and the computer determines the transformation between the image and the physical space. Although bone-anchored fiducial markers (screws) provide the most reliable and accurate method for surgical registration, adhesive-mounted skin markers (skin marker or head-set) are the method of choice because they do not require an invasive procedure. With this method, an average application accuracy of 2 to 7 mm can be attained [1,5,10,13,14]. In most comparative studies, paired-point skin fiducial registration was more accurate than paired-point landmark registration or surface registration. However, this method is associated with additional cost and requires time and resources [2]. As an alternative to paired-point registration, in which corresponding points are matched, surface-based registration attempts to align the contour of a physical surface with the corresponding image surface. In most studies, surface registration has been shown to be less accurate and less reliable than fiducial registration [2,9,14]. There are two crucial points for successful surface registration with the z-touch method. First, it is extremely important to avoid any skin movement in the scanning target areas, i.e., around the eyes, forehead, nasion and zygoma. It is mandatory to remove any adhesive material in this region before scanning. Furthermore, laser scanning should be confined to areas where the skin is thin and closely follows the bony relief. Second, it is of paramount importance to use high-quality images. Use of retrospectively acquired images is attractive and may be an important economic consideration. The images must be of high quality, with a high-resolution matrix (256x256), and in thin slices (1-2 mm).
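The transformation for paired-point registration is classically obtained as the least-squares rigid fit between the two fiducial point sets, e.g. with the SVD-based closed-form solution. A minimal sketch under that assumption (the textbook method, not necessarily the navigation system's actual implementation):

```python
import numpy as np

def paired_point_registration(image_pts, physical_pts):
    """Least-squares rigid transform (R, t) mapping image-space fiducials
    onto their physical-space counterparts via SVD of the
    cross-covariance matrix."""
    P = np.asarray(image_pts, float)
    Q = np.asarray(physical_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                           # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    # RMS residual at the fiducials (fiducial registration error)
    fre = np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))
    return R, t, fre
```

Note that the returned residual is the error at the fiducials themselves, which is not the same as the target localizing error at the surgical field discussed above.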
5. Conclusion
Although our results show much better accuracy when using the head-set compared to registration with the z-touch, laser scanning is a very interesting technique with tremendous benefits (low radiation load, fast, native images can be used). We think that with better technical devices, for example advanced lasers and better software, surface registration will be a very interesting and useful method for cranio-maxillofacial surgery in the future.
References
1. Alp MS, Dujovny M, Misra M, Charbel FT, Ausman JI: Head registration techniques for image-guided surgery. Neurol Res 20: 31-37 (1998)
2. Bucholz R, Macneil W, Fewings P, Ravindra A, McDurmont L, Baumann C: Automated rejection of contaminated surface measurements for improved surface registration in image guided neurosurgery. Stud Health Technol Inform 70: 39-45 (2000)
3. Cutting C: Discussion on Fialkov JA et al.: A stereotactic system for guiding complex craniofacial reconstruction. Plast Reconstr Surg 89: 346-348 (1992)
4. Cutting C, Taylor R, Khorramabadi D, Haddad B, McCarthy JG: A virtual reality approach to intraoperative bone fragment positioning during craniofacial surgical procedures. J Craniofac Surg 6: 33-37 (1995)
5. Golfinos JG, Fitzpatrick BC, Smith LR, Spetzler RF: Clinical use of a frameless stereotactic arm: Results of 325 cases. J Neurosurg 83: 197-205 (1995)
6. Hassfeld S, Muhling J, Zoller J: Intraoperative navigation in oral and maxillofacial surgery. Int J Oral Maxillofac Surg 24: 111-119 (1995)
7. Hassfeld S, Muhling J, Wirtz CR, Knauth M, Lutze T, Schulz HJ: Intraoperative guidance in maxillofacial and craniofacial surgery. Proc Inst Mech Eng H 211: 277-283 (1997a)
8. Hassfeld S, Raczkowsky J, Bohner P, Hofele C, Holler C, Muhling J, Rembold U: Robotik in der Mund-, Kiefer- und Gesichtschirurgie. Moglichkeiten - Chancen - Risiken. Mund Kiefer GesichtsChir 1: 316-323 (1997b)
9. Helm PA, Eckel TS: Accuracy of registration methods in frameless stereotaxis. Comput Aided Surg 3: 51-56 (1998)
10. Hirschberg H, Kirkeby OJ: Interactive image directed neurosurgery: Patient registration employing the Laitinen stereo-adapter. Minim Invasive Neurosurg 39: 105-107 (1996)
11. Marmulla R, Niederdellmann H: Computer assisted bone segment navigation. J Cranio-Maxillofac Surg 26: 347-359 (1998)
12. Marmulla R, Hassfeld S, Luth T, Muhling J: Laser-scan-based navigation in cranio-maxillofacial surgery. J Craniofac Surg 31: 267-277 (2003)
13. Maurer CR Jr, McCrory JJ, Fitzpatrick JM: Estimation of accuracy in localizing externally attached markers in multimodal volume head images. In: Loew MH (ed) Medical Imaging 1993: Image Processing. Bellingham, SPIE Press, 43-54 (1993)
14. Sipos EP, Tebo SA, Zinreich SJ, Long DM, Brem H: In vivo accuracy testing and clinical experience with the ISG Viewing Wand. Neurosurgery 39: 194-202 (1996)
15. Schramm A, Gellrich NC, Schimming R, Schmelzeisen R: Rechnergestutzte Insertion von Zygomaticumimplantaten nach ablativer Tumorchirurgie. Mund Kiefer GesichtsChir 4: 292-295 (2000)
16. Watzinger F, Birkfellner W, Wanschitz F, Millesi W, Schopper C, Sinko K, Huber K, Bergmann H, Ewers R: Positioning of dental implants using computer-aided navigation and an optical tracking system: case report and presentation of a new method. J Cranio-Maxillofac Surg 27: 77-81 (1999)
17. West JB, Fitzpatrick JM, Toms SA, Maurer CR Jr, Maciunas RJ: Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery 48: 810-817 (2001)
LOCALISATION OF MOVING TARGETS FOR NAVIGATED RADIOTHERAPY*
LUCIA VENCES†, OTTO SAUER
Klinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 4, 97080 Würzburg, Germany
MICHAEL ROTH, KAJETAN BERLINGER†, MANFRED DOETTER AND ACHIM SCHWEIKARD
Medical Applications Research Group, Technische Universität München, Boltzmannstr. 3, 85748 Garching bei München, Germany
For an effective radiotherapy treatment with external beams it is essential to know the tumour's position. To accurately localise a tumour, interfractional and intrafractional organ movement have to be taken into account. With a global rigid registration of the planning and the treatment CT data, the target volume and the actual isocentre may be found in the treatment data set. However, this approach may not be precise enough. To improve the accuracy, a local registration of a volume of interest (VOI) was performed. At the Clinic for Radiotherapy of the University Würzburg, several tools were programmed in order to create localisation software for clinical use. Among them: retrieval of images from different DICOM servers, a GUI for the VOI definition, and the local registration capability with output of the target's translation and rotation relative to an external fiducial system. We carried out 180 tests with our application utilising eleven pairs of CT-CT volumes. We found that in 75% of the cases, the local registration covers the region of interest better than (62%) or at least as well as the global registration (13%).
1. Introduction
An important issue for an effective treatment of cancer with external beam radiotherapy is the accurate localisation of the tumour. Because a treatment consists of several sessions and the whole course of treatment may take some weeks, there are many factors that may make it difficult to locate a tumour. For instance, the patient's position on the therapy couch may not be identical in each session. Also, loss of weight, different filling of the bladder or rectum, or respiratory movement can displace tumours lying in the lung or abdomen. Therefore, interfractional and intrafractional organ movement have to be taken into account to follow the tumour's position [1-5].
* This work is supported by the Deutsche Forschungsgemeinschaft and is part of Program 1124 "Medizinische Navigation und Robotik".
† Work supported by Deutsche Forschungsgemeinschaft.
Interfractional movement can be tracked using CT data of a patient. Prior to a treatment, the target is delineated in a 3D patient model obtained from a CT. The beams and the isocentre are planned on the basis of this planning CT. Before a treatment is executed, a new CT is acquired with a Philips Tomoscan located in the therapy room to capture the patient's actual position. By means of a registration of the planning and the treatment CTs, the tumour's position can be verified and appropriate adjustments for the radiation therapy can be made. In this paper, we present a local rigid registration application to identify interfractional movement. This application was programmed as part of a project for navigated radiotherapy of the Clinic for Radiotherapy of the University Würzburg and the Medical Applications Research Group of the Technical University in Munich, in order to create localisation software for clinical use.
2. Related work
The purpose of image registration is to find the spatial correspondence of two images. This correspondence maps the coordinates of one image to the coordinates of the other. Image registration is a central research topic and there have been several reviews on it, among them [6, 7]. Van Herk et al. [8] proposed a method where the user has to manually mark some organ contours in order to obtain a rigid registration describing prostate motion. The method we present in the following paragraphs also uses rigid transformations. It requires user intervention to mark a region of interest in a plan volume. As output, the program localises this region of interest in a therapy data set, returns the placement of the region of interest between the two volumes, and also returns the transformation describing this motion.
3. Method
3.1. Global Registration
In a first step, a global rigid registration is carried out. The planning CT image is retrieved from a DICOM server and the treatment CT from the Tomoscan in the therapy room. The application receives the planning and the treatment volumes as input and starts the registration. A rigid registration aims to find a rigid body transformation, consisting of translations and rotations, between two images. There are several advantages of using a rigid body transformation. Such a transformation has only six degrees of freedom; therefore the registration can be solved by a modern PC in a short time. The search for an appropriate transformation between both volumes can be driven by similarity measures dependent only on image information, like mutual information [6]. This means that there is no need for additional information, like user-defined landmarks or segmentation, and the process is carried out automatically. However, there are some disadvantages of this model. A global rigid body registration cannot provide an accurate solution for tumour motion. The distance between any two points is preserved after applying a rigid transformation. Although this is the case for anatomical structures like bones, it is not necessarily true for more flexible body parts, like internal organs, soft tissue or the spine. These cases cannot be modelled with a global rigid registration.
3.2. Local registration
To improve the accuracy of the global registration in our software, a local registration is carried out in a second step. The planning CT may already contain a marked region of interest (ROI) around the tumour, or it can be provided by the user with the mouse. Using the global transformation, a "search region" is defined in the treatment CT, centred on the mapped coordinates of the ROI's centre and twice the size of the ROI. The ROI and the search region are registered again with a rigid transformation and the result is used to calculate the coordinates of the target. This local registration provides a better localisation of the target. Because both volumes are from the same patient and were acquired within a few days, big changes in shape and tumour position are not expected. Thus small local movements can be captured by the local registration. The coordinates of the target can be calculated with the resulting local transformation. The patient lying on the treatment couch may have skin markers, which are easily identified in the treatment volume. These markers establish a coordinate system.
The program can calculate the coordinates of the target and isocentre in terms of this coordinate system.
4. Results
We carried out 180 tests with our application using eleven pairs of CT-CT volumes. These volumes correspond to the thorax, neck, head and pelvis. In most of the cases, visual inspection suggests that the local registration delivers an improved result compared with the global registration alone. A quantitative comparison of both methods was also carried out. To obtain a quantitative performance measure, we examined the differences between the region of interest in the plan volume and the two resulting images from the therapy volume. In this way, we compared how well a resulting image covered the original region of interest. In our tests, we found that in 75% of the cases, the local registration covers the region of interest better than (62%) or at least as well as the global registration (13%). The other 25% corresponds to body parts with higher flexibility, like the spine. In these cases both the local and the global registration performed poorly, because such movements cannot be captured by a rigid body model.
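The paper does not name the coverage measure used for this quantitative comparison. One plausible instance of such a measure, shown here purely as an illustration on toy binary masks (the function name and the masks are our own, not the authors'), is a Dice-style overlap between the planned ROI and each registered result:

```python
# Hypothetical sketch: a Dice overlap as one way to score how well a
# registered result covers the original region of interest.
import numpy as np

def dice_overlap(roi_mask, result_mask):
    """Dice coefficient between two boolean masks (1.0 = perfect overlap)."""
    roi = np.asarray(roi_mask, dtype=bool)
    res = np.asarray(result_mask, dtype=bool)
    intersection = np.logical_and(roi, res).sum()
    total = roi.sum() + res.sum()
    return 2.0 * intersection / total if total else 1.0

# Toy example: a 10x10 ROI compared against two candidate registrations.
roi = np.zeros((20, 20), dtype=bool); roi[5:15, 5:15] = True
local = np.zeros_like(roi); local[6:16, 5:15] = True   # off by 1 voxel
glob = np.zeros_like(roi); glob[9:19, 5:15] = True     # off by 4 voxels

print(dice_overlap(roi, local) > dice_overlap(roi, glob))  # → True
```

A higher Dice value means the candidate covers the ROI better, which matches the "covers better / at least as good" ranking reported above.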
5. Discussion
For body parts where local deformations may take place, non-rigid registration may deliver better results [9, 10], but at a higher computational cost. The program is a valuable tool in radiotherapy practice. The required CT images can be retrieved from different DICOM servers. The user can select a region of interest in the planning volume with the mouse, or the program reads it from a file. With this information the program automatically finds the region of interest in the treatment volume, obtains the tumour's current coordinates and provides the transformation used for the registration. If the result is not satisfactory, the user can mark the target in the treatment volume with the mouse and the program calculates its coordinates. In our tests, the local registration needed 20 to 60 seconds (depending on the ROI's size) on a Pentium 4 PC with 512 MB RAM running at 2.4 GHz.
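As a rough illustration of the local registration idea, and not the authors' implementation, the following sketch restricts itself to 2-D images and pure translations, scores candidate shifts inside a search region that extends the ROI on every side, and uses a histogram-based mutual information measure; all names and the synthetic data are our own assumptions:

```python
# Minimal sketch: translation-only local registration of a plan ROI inside
# a treatment "search region", scored by histogram mutual information.
# The production software optimises a full 6-DOF rigid transform instead.
import numpy as np

def mutual_information(a, b, bins=8):
    """Plug-in (histogram) estimate of the mutual information of two images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of b
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def register_roi(plan, treat, roi, search=4):
    """Exhaustive search for the ROI's shift inside a treatment search region."""
    (r0, r1), (c0, c1) = roi
    patch = plan[r0:r1, c0:c1]
    best, best_shift = -np.inf, (0, 0)
    for dr in range(-search, search + 1):       # the search region extends
        for dc in range(-search, search + 1):   # the ROI in every direction
            cand = treat[r0 + dr:r1 + dr, c0 + dc:c1 + dc]
            mi = mutual_information(patch, cand)
            if mi > best:
                best, best_shift = mi, (dr, dc)
    return best_shift

# Synthetic example: a bright 20x20 "tumour" moved by (2, -1) voxels.
rng = np.random.default_rng(0)
plan = rng.normal(0.0, 0.05, (60, 60));  plan[20:40, 20:40] += 1.0
treat = rng.normal(0.0, 0.05, (60, 60)); treat[22:42, 19:39] += 1.0

print(register_roi(plan, treat, roi=((15, 45), (15, 45))))
```

Because mutual information depends only on image intensities, the search needs no landmarks or segmentation, which mirrors the automatic operation described in Section 3.1.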
6. Conclusion
We have presented an application for tumour localisation. This application calculates the tumour's coordinates while the patient lies on the treatment couch. After an automatic initialization, the user provides a ROI in the plan data. The new coordinates of the ROI in the therapy volume are then calculated. Because this is done in a few seconds, the program is suitable for user interaction. For some body parts, big changes in shape and position are not expected, because both data sets are acquired within a few days. Although non-rigid models could provide more accurate coordinates in more flexible body parts, like the neck, they are also more time consuming. In such situations, the user has the option of manually marking the target, and the program calculates its coordinates. Therefore the application, using a local rigid registration method, provides an acceptable compromise between time and accuracy. The registration and DICOM retrieval applications are part of a joint project for navigated radiotherapy of the Clinic for Radiotherapy of the University Würzburg and the Medical Applications Research Group of the Technical University in Munich. They form the basis for future developments treating intrafractional organ movement. The trajectory of the target within the respiratory cycle can be found by means of local registration from dynamic CT scans. By correlating these data with a real-time signal of the respiratory cycle, the control of a treatment machine will become possible.
7. Acknowledgments
This work is supported by the Deutsche Forschungsgemeinschaft. The authors thank Kurt Baier, Klaus Bratengeier, Jürgen Meyer and Jörn Wulf for valuable discussions.
References
1. E.C. Ford et al., Cone-beam CT with megavoltage beams and an amorphous silicon electronic portal imaging device: Potential for verification of radiotherapy of lung cancer, Med. Phys. 29, 12 (2002)
2. G. Soete et al., Initial clinical experience with infrared-reflecting skin markers in the positioning of patients treated by conformal radiotherapy for prostate cancer, Int. J. Radiation Oncology Biol. Phys. 52, 3 (2002)
3. W.G. O'Dell et al., Dose broadening due to target position variability during fractionated breath-held radiation therapy, Med. Phys. 29, 7 (2002)
4. S.E. Erridge et al., Portal imaging to assess set-up errors, tumor motion and tumor shrinkage during conformal radiotherapy of non-small cell lung cancer, Radiotherapy & Oncology 66 (2003)
5. E.C. Ford, G.S. Mageras, E. Yorke, and C.C. Ling, Respiration-correlated spiral CT: A method of measuring respiratory induced anatomic motion for radiation treatment planning, Med. Phys. 30, 2 (2003)
6. J.V. Hajnal et al., Medical Image Registration, The Biomedical Engineering Series, CRC Press (2001)
7. J.B.A. Maintz and M.A. Viergever, A survey of medical image registration, Med. Image Anal. 2, 1 (1998)
8. M. van Herk et al., Quantification of organ motion during conformal radiotherapy of the prostate by three dimensional image registration, Int. J. Radiation Oncology Biol. Phys. 30, 5 (1995)
9. H. Lester et al., Non-linear registration with the variable viscosity fluid algorithm, IPMI'99 (1999)
10. J.A. Little et al., Deformations incorporating rigid structures, Comp. Vision and Image Understanding 66, 2 (1997)
ACCURACY AND PRACTICABILITY OF LASER SURFACE SCANNING FOR REGISTRATION IN IMAGE GUIDED NEUROSURGERY
RENE KRISHNAN, ANDREAS RAABE AND VOLKER SEIFERT
Department of Neurosurgery, Johann Wolfgang Goethe-University, Schleusenweg 2-16, 60528 Frankfurt/Main, Germany
Placing multiple external fiducial markers for patient registration in image guided neurosurgery has some major disadvantages. A technique avoiding these markers would be attractive. We report our clinical experience with a new laser scanning-based technique of surface registration. The purpose of this study was to prospectively measure both the calculated registration error and the application accuracy using laser surface registration for intracranial image guided surgery in a routine clinical setting. 180 consecutive patients with different intracranial pathologies were scheduled for intracranial image guided surgery utilizing a passive infrared surgical navigation system (z-touch, BrainLAB, Heimstetten, Germany). The first 34 consecutive patients were registered both with laser and marker based techniques. Surface registration was performed using a class I laser device that emits a visible laser beam. The Polaris camera system detects the skin reflections of the laser, which the software uses to generate a virtual 3D matrix of the individual anatomy of the patient. An advanced surface-matching algorithm then matches this virtual 3D matrix to the 3-dimensional MR data set. Application accuracy was assessed using the localization error for three distant anatomical landmarks. Laser surface registration was successful in 174 patients. The registration for 6 patients failed due to mismatch of the registered and calculated surfaces (n=4) and technical problems (n=2). In the 34 patients registered with both techniques, the application accuracy for the surgical field was 2.4 ± 1.7 mm (range 1-9 mm). Application accuracy was higher for frontally located lesions (mean 1.8 ± 0.8 mm, n=13) compared to temporal, parietal, occipital or infratentorial lesions (mean 2.8 ± 2.1 mm, n=21). The true application accuracy was not correlated with the calculated accuracy returned by the system after registration.
In this clinical study, laser scanning for surface registration was an accurate, robust and easy-to-use method of patient registration for image guided surgery. We now use this registration method in our daily routine.
1. Introduction
Neuronavigation has become an increasingly important part of planning and performing intracranial surgery, as it allows neurosurgery to be less invasive and more effective [1,2,3,4]. Establishing a mathematical relationship that maps the coordinates of the acquired image space (MRI, CT) to the physical space coordinates of the patient is called registration. Although the most commonly used method of registration is paired-point registration with adhesive-mounted external fiducials, it has major drawbacks, as it requires time and resources and carries the potential risk of marker displacement or loss. Unlike point-based registration, in which corresponding points are matched, surface-based registration attempts to align the contour of a physical surface with the corresponding image surface. The techniques and algorithms are complex. For clinical use, however, surface registration is attractive because it eliminates the problems and costs of fiducial-based registration. In this study, we report our clinical experience with the laser scanning-based technique of surface registration. The objective was to use laser surface registration for routine image-guided surgery, to measure calculated registration accuracy and application accuracy, and to assess the handling of software and hardware when used in the clinical setting.
2. Patients and Methods
This prospective study included 34 consecutive patients with different intracranial diseases (13 gliomas, 7 meningiomas, 5 cavernous malformations, 5 metastases, and 4 miscellaneous conditions) who were scheduled for image-guided surgery. Lesion location was frontal in 13 cases, parietal in 10 cases, temporal in 6 cases, occipital in 3 cases, and infratentorial in 2 cases. There were 16 female and 18 male patients ranging in age from 1 to 70 years (mean, 47 yr). We have used this registration technique in over 250 patients so far. Image-guided surgery was performed by using a passive infrared surgical navigation system (VectorVision2, BrainLAB, Heimstetten, Germany) [5,6]. All surgical procedures were performed with the patient under general anesthesia. The head was immobilized by use of a Mayfield clamp. Once the patient was positioned appropriately for the surgical procedure, the position was registered to establish a spatial correspondence between the patient's head and the acquired images. A rigid mechanical connection was established between the Mayfield clamp and the reference star. The reference star was positioned outside the surgeon's working space in such a manner as to maintain visibility for the cameras during the surgical procedure. After registration and surgical replanning, the operative procedure was performed by use of image injection and microscope or pointer navigation. The z-touch laser device (BrainLAB) is a commercially available noncontact digitizer for matching the coordinate systems of the surgical field and the three-dimensional (3-D) imaging data. It is a Class I laser device that emits a visible laser beam, which appears as a red point on the skin of the patient for planning and guidance. The laser beam is visible to the cameras of the VectorVision image-guided surgery system.
The Polaris camera system (Northern Digital, Waterloo, ON, Canada) detects the skin reflections of the laser, which the software uses to generate a virtual 3-D matrix of the individual anatomy of the patient. An advanced surface-matching algorithm then matches this virtual 3-D matrix to the 3-D computed tomographic/magnetic resonance imaging data set. Typical target areas for laser surface scanning are the nasion, forehead, and the medial, superior, and lateral rim of the orbita. In these areas, there is no hair and the skin is typically thin and follows contour-forming bone prominences, which form an individual surface relief. The eye, eyelid, and eyebrow should be spared. Care should be taken to apply the pins of the Mayfield clamp in a manner that avoids skin movement. Because the target areas for collecting surface points with the laser are situated mainly frontally or frontolaterally and the points are nonsurrounding, there may be increasing error in the more occipital regions. Therefore, at the completion of laser scanning, a set of optional points of the parietal or occipital scalp surface can be acquired by use of the pointer. The function of this feature is to improve the application accuracy when the lesion is situated in the cerebellum or the parietal or occipital lobe. In the calculation process, two surface representations are aligned, one constructed from the preoperative images and one from the surface points collected during the laser scanning of the patient's head. The surface-matching algorithm partially uses the iterative closest point algorithm. After the calculations are completed, a measure of the goodness of fit of the registration is displayed, which represents the calculated registration error (CRE) of the surface-matching procedure. The best measure to assess the quality of registration and the "true" application accuracy is the target localizing error (LE) [7,8], where "target" represents the surgical field. We have used the LE of three landmarks that were not involved in the registration procedure.
In detail, application accuracy (reflected by the LE) was assessed by placing the pointer in the physical space (patient) at three distant and surrounding anatomic landmarks and measuring the distance to the image space for: 1) the anterior border of the external auditory canal, left side; 2) the anterior border of the external auditory canal, right side; and 3) the nasion. Mean LE was expressed as mean value ± standard deviation. To take into account the importance of application accuracy at the site of approach and dissection, the LE of the landmark closest to the surgical field was given in addition to the LE of all landmarks.
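The surface-matching step above is stated to partially use the iterative closest point (ICP) algorithm. The commercial algorithm is not published; the following minimal ICP sketch (brute-force nearest neighbours plus an SVD-based rigid fit, one common choice for the per-iteration alignment) only illustrates the iterate-match-fit idea, with a made-up toy surface:

```python
# Minimal ICP sketch: repeatedly match each scanned point to its closest
# surface point, then solve for the best rigid alignment of the matches.
import numpy as np

def best_fit_rigid(P, Q):
    """Least-squares R, t with R @ P[i] + t ≈ Q[i] (SVD / Kabsch step)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard vs. reflection
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=10):
    """Align a scanned point cloud onto a surface point cloud."""
    src = source.copy()
    for _ in range(iters):
        # match: closest target point for every current source point
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        # fit and apply the rigid update
        R, t = best_fit_rigid(src, matched)
        src = src @ R.T + t
    return src

# Toy surface: a curved grid, displaced by a small known rotation + shift.
g = np.arange(-1.0, 1.01, 0.25)
X, Y = np.meshgrid(g, g)
target = np.column_stack([X.ravel(), Y.ravel(), 0.1 * (X**2 + Y**2).ravel()])
ang = 0.05
Rz = np.array([[np.cos(ang), -np.sin(ang), 0],
               [np.sin(ang),  np.cos(ang), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.02, -0.01, 0.03])

aligned = icp(source, target)
print(np.abs(aligned - target).max() < 1e-6)  # → True
```

Like the clinical procedure, this only works when the initial mismatch is small relative to the surface structure; large displacements can trap ICP in a wrong local minimum, which is one reason a mismatch of registered and calculated surfaces can make a registration fail.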
3. Results
Handling of software and hardware in the clinical setting was easy and uncomplicated. The time required for laser surface registration ranged between 3 and 8 minutes. There was no technical failure. For the landmark closest to the surgical field, the LE was 2.4 ± 1.7 mm (range, 1-9 mm). Application accuracy for laser surface scanning was higher for the surgical field of frontally located lesions (mean LE, 1.8 ± 0.8 mm; range, 1-4 mm; n = 13) as compared with temporal, parietal, occipital, or infratentorial lesions (mean LE, 2.8 ± 2.1 mm; range, 1-9 mm; n = 21). The CRE ranged from 0.5 to 2.0 mm (mean, 1.13 ± 0.37 mm) with laser surface registration. This accuracy has been confirmed by other researchers [9]. A mean of 100 surface points was collected for registration in each procedure. There was no correlation between the CRE and the LE of any landmark. This finding suggests that the measure of CRE has limited practical significance.
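The LE bookkeeping behind these numbers reduces to mapping a pointer-touched landmark through the registration transform and measuring the distance to its known image-space position. The sketch below shows only this arithmetic; the transform values and landmark coordinates are illustrative, not measured data:

```python
# Sketch of the localization error (LE): distance between a registered
# (mapped) landmark and its image-space position. All numbers are made up.
import numpy as np

def localization_error(R, t, p_patient, p_image):
    """Euclidean distance (mm) between the mapped landmark and image position."""
    return float(np.linalg.norm(R @ p_patient + t - p_image))

# Illustrative registration: 1 degree rotation about z plus a small offset.
theta = np.deg2rad(1.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
t = np.array([0.5, -0.3, 0.2])                  # mm

nasion_patient = np.array([0.0, 90.0, 30.0])    # pointer position, mm
nasion_image = np.array([0.0, 90.0, 30.0])      # image-space position, mm

le = localization_error(R, t, nasion_patient, nasion_image)
print(round(le, 2))  # → 1.13
```

The example also shows why LE and CRE can decouple: a transform can fit the scanned frontal surface well (low CRE) yet still displace a distant landmark by more than a millimetre.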
4. Discussion
Registration after laser surface scanning, as used in our study, is based mainly on surface data acquired in the frontal region. As mentioned above, a nonsurrounding array of points may cause increasing LE in distant surgical areas. To prevent increasing LE, we used a method developed to allow for the acquisition of additional surface points by use of the pointer. We acquired two to eight additional surface points distant from the frontal region in all of our patients with temporal, parietal, occipital, and infratentorial lesions. Nonetheless, we found a slightly higher application accuracy for frontally located surgical fields as compared with more posteriorly located surgical areas (LE, 1.8 ± 0.8 mm versus 2.8 ± 2.1 mm). However, the application accuracy for temporal, parietal, occipital, and infratentorial lesions was still equivalent to or exceeded most of the accuracy values for fiducial registration reported in the literature [10,5,11,12]. Nevertheless, one should be aware of a potential LE when the surgical field is occipital or infratentorial. There are two crucial points to successful surface registration with the z-touch method. First, it is extremely important to avoid any skin movement in the scanning target areas, i.e., around the eyes, forehead, nasion, and zygoma. It is mandatory to remove any adhesive material in this region before scanning. Furthermore, laser scanning should be confined to areas where the skin is thin and closely follows the bony relief. Second, it is of paramount importance to use high-quality images. Use of retrospectively acquired images is attractive and may be an important economic consideration. However, retrospective images currently are obtained and collected in highly variable ways, and they may introduce unpredictable sources of localization error.
Therefore, when it is planned that retrospectively acquired images will be used for laser surface registration, the images must meet the needs of navigation: they should be of high quality, with a high-resolution matrix (256 x 256), and in thin slices (1-2 mm). With slices thicker than 2 mm, there is an unacceptable increase in target LE and a higher failure rate of surface matching. Thus, we do not recommend the use of magnetic resonance imaging scans with slices thicker than 2 mm.
5. Conclusion
In the more than 250 patients registered, this method proved to be an accurate, robust, and easy-to-use method of patient registration for image-guided surgery. Laser surface-based registration potentially eliminates some practical drawbacks of adhesive-mounted skin fiducials. The independence of the operation time from data acquisition, as no markers have to be kept in place, is an advantage that should not be underestimated in clinical practice.
References
1. Barnett, G.H., Miller, D.W., Weisenberger, J., Frameless stereotaxy with scalp-applied fiducial markers for brain biopsy procedures: experience in 218 cases, J. Neurosurg. (1999) 569-576
2. Germano, I.M., Villalobos, H., Silvers, A., Post, K.D., Clinical use of the optical digitizer for intracranial neuronavigation, Neurosurgery (1999) 261-269
3. Haase, J., Image-guided neurosurgery/neuronavigation/the SurgiScope: reflexions on a theme, Minim. Invasive Neurosurg. (1999) 53-59
4. Kelly, P.J., Stereotactic surgery: what is past is prologue, Neurosurgery (2000) 16-27
5. Gumprecht, H.K., Widenka, D.C., Lumenta, C.B., BrainLab VectorVision Neuronavigation System: technology and clinical experiences in 131 cases, Neurosurgery (1999) 97-104
6. Muacevic, A., Uhl, E., Steiger, H.J., Reulen, H.J., Accuracy and clinical applicability of a passive marker based frameless neuronavigation system, J. Clin. Neurosci. (2000) 414-418
7. Maurer, C.R., Jr., Fitzpatrick, J.M., Wang, M.Y. et al., Registration of head volume images using implantable fiducial markers, IEEE Trans. Med. Imaging (1997) 447-462
8. West, J.B., Fitzpatrick, J.M., Toms, S.A., Maurer, C.R., Jr., Maciunas, R.J., Fiducial point placement and the accuracy of point-based, rigid body registration, Neurosurgery (2001) 810-816
9. Schlaier, J., Warnat, J., Brawanski, A., Registration accuracy and practicability of laser-directed surface matching, Comput. Aided Surg. (2002) 284-290
10. Alp, M.S., Dujovny, M., Misra, M., Charbel, F.T., Ausman, J.I., Head registration techniques for image-guided surgery, Neurol. Res. (1998) 31-37
11. Helm, P.A., Eckel, T.S., Accuracy of registration methods in frameless stereotaxis, Comput. Aided Surg. (1998) 51-56
12. Sipos, E.P., Tebo, S.A., Zinreich, S.J., Long, D.M., Brem, H., In vivo accuracy testing and clinical experience with the ISG Viewing Wand, Neurosurgery (1996) 194-202
USING THE AWIGS SYSTEM FOR PREPARATION OF COMPUTER AIDED SURGERY
H. KNOOP, J. RACZKOWSKY, H. WÖRN
Universität Karlsruhe (TH), Institute for Process Control and Robotics (IPR), Engler-Bunte-Ring 8, D-76131 Karlsruhe, Germany
U. WYSLUCHA
MAQUET GmbH & Co. KG, Kehler Straße 31, D-76437 Rastatt, Germany
T. FIEGELE
University Hospital Innsbruck, Department of Neurosurgery, Anichstraße 35, A-6020 Innsbruck, Austria
The MAQUET AWIGS system is a synthesis of an operating table, a radiolucent patient transfer board and a CT. The table rests on two columns that travel on rails in the floor of the OR, with additional lateral shift capability. When the transfer board with the patient is moved back out of the CT for intervention, a registration step to the patient's coordinate system is required. Our cooperation project aims to aid the surgeon in this step by using an automatic procedure without the need for an additional technical expert, thus reducing costs and time.
1. Introduction
The use of intraoperative imaging requires a registration to the patient's location when the patient is moved back out of the Computed Tomography (CT) scanner for intervention. For intraoperative registration, bone-implanted marker screws, anatomic landmark positions or surface light scanner approaches can be found in the literature. However, these approaches often stress the patient with an additional intervention, and are influenced by the surgeon's interpretation or the prevailing light situation in the operating theatre. Their intraoperative results therefore frequently fall short of those presented in laboratory setups [1]. Often, an additional technical expert is needed, causing costs and wasting resources.
1.1. The system
The Advanced Workplace for Image Guided Surgery (AWIGS) system of MAQUET GmbH & Co. KG, Rastatt, Germany is a synthesis of an operating table including a radiolucent carbon-fibre transfer board for the patient. The operating table rests on two columns that independently travel on rails in the floor of the operating room (OR). Two approaches are manufactured.
With a special CT table, the operating table travels to the CT and docks to it. For intraoperative data acquisition, the patient is transferred into and out of the CT gantry with the transfer board on a moving belt. The CT table and the operating table can be moved together to position the patient in the gantry. In a sliding gantry system, the transfer board is used together with a special operating table top. The transfer board can be unlocked and the relevant patient anatomy can be moved out of the table top. Here, the CT travels on rails, too. It is moved to cover the patient. After successful data acquisition it moves back to the operating position.
1.2. Environment
The AWIGS system can be set up for use in traumatology, orthopaedic, cranio- and neurosurgery, for example, and can be combined with different CT scanners from leading manufacturers. The system allows the surgeons to position the patient freely for intraoperative imaging. In such environments, additional navigation systems are in widespread use to work with the actual image data after an intraoperative registration step.
2. Challenges
Our cooperation project aims to aid the surgeon in registering to the patient's coordinate system. The solution has to meet the following requirements.
1. Speed meeting intraoperative requirements.
2. Exactness and accuracy according to clinical specifications.
3. Universality in the use of the different CT systems.
4. Universality in the use of the different navigation systems.
5. Easy setup for a single surgeon without an additional technical expert.
We decided that the solution should neither be invasive nor otherwise patient-mounted, but should reference the AWIGS transfer board that travels together with the patient in the operating theatre.
3. Solution
The presented solution consists of a hardware and a software part. The hardware does not need to influence conventional operating steps. The software is designed to run on an off-the-shelf notebook.
3.1. Hardware
Following stereotactic head frame approaches and their basic research [2], we designed a Scan Reference Frame (SRF). It consists of titanium rods inlaid in POM plastic and can be fixated inside the scanned region of interest (ROI). The first evaluations were done with neurosurgery setups. After designing and manufacturing the first SRF prototype, we found that registration errors decrease when using individual coordinates from sphere and cylinder measurement routines on a Brown & Sharpe 3D measurement machine instead of CAD data. The first SRF prototype consists of seven titanium rods (see Figure 1, first row); the second prototype is reduced in manufacturing complexity and size to only six rods (Figure 1, second row).
Figure 1. SRF prototypes I (upper row) and II (lower row) in Computer Aided Design, after Computer Aided Manufacturing, and their Quality Assurance documentation (from left to right).
The SRF prototype II is also equipped with a holder clutch for a commercial navigation system's rigid body. The clutch was additionally measured with the rigid body and its reflecting spheres mounted. These coordinates were used for the subsequent calculations.
3.2. Software
The software is programmed in C++ and is divided into modules. Each module can be replaced for changes or improvements in further registration tasks. The
software runs on a standard off-the-shelf notebook with an Intel Pentium M processor at 1.7 GHz, 1024 MB of RAM and an nVidia Quadro FX Go700 graphics adapter with 128 MB. The image data is read directly in the DICOM* format; the data in Hounsfield values can be accessed without reduction. The FiducialFinder module searches a single slice for the entrance points of the fiducial rods. The LineFinder module combines adjacent points into lines that represent the centers of the fiducial rods. These two modules are configured by parameters that were evaluated for the different scan setups; the parameters do not require intraoperative changes. The SRF module calculates the rigid-body transformation from the LineFinder output using Horn's algorithm [3]. It is configured by an XML file that contains the necessary geometry information from the quality-assurance documentation of the individual measurements.
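The grouping of per-slice fiducial points into rod lines can be illustrated with a short sketch. This is a hypothetical, minimal Python version (the original software is C++ and is not reproduced here): the function names, the `max_jump` threshold and the greedy chaining are our illustrative assumptions, standing in for whatever heuristics the real LineFinder module uses.

```python
import numpy as np

def link_points_to_lines(slices, max_jump=2.0):
    """Greedily chain per-slice fiducial points (x, y, z) into rod
    candidates: a point continues a chain if it lies in a later slice
    within `max_jump` of the chain's last point, both axially and
    laterally."""
    chains = []
    for points in slices:               # slices ordered by z
        for p in points:
            p = np.asarray(p, float)
            for c in chains:
                if 0 < p[2] - c[-1][2] <= max_jump and \
                   np.linalg.norm(c[-1][:2] - p[:2]) <= max_jump:
                    c.append(p)
                    break
            else:
                chains.append([p])
    return [np.array(c) for c in chains if len(c) >= 3]

def fit_line(points):
    """Least-squares 3D line fit: centroid plus dominant direction
    from the SVD of the centered point cloud."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[0]              # point on line, unit direction
```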
4. Evaluation
The evaluation scans were performed to test the algorithm and to adapt its parameters. They were performed with and without phantoms to investigate the artefacts resulting from the SRF.
4.1. Laboratory
For the first laboratory setup, a Siemens Somatom Sensation 16 CT with the settings shown in Table 1 was used. The selected ROI included only our SRF, resulting in small pixel spacings per slice.

Table 1. CT scan parameters for the laboratory setup.
Manufacturer        Siemens
Model               Somatom Sensation
Number of slices    16
Slice spacing       0.7 mm
Pixel spacing       0.25 mm
4.2. Intraoperative
For the intraoperative evaluation, a General Electric HiSpeed CT was used with a setup according to Table 2. To investigate realistic circumstances, the ROI was much larger than in the laboratory setup, including the patient's anatomy and thus resulting in larger pixel spacings.

* ACR-NEMA Digital Imaging and Communications in Medicine standard
Table 2. CT scan parameters for the intraoperative setup.
Manufacturer        General Electric (GE)
Model               HiSpeed
Slice spacing       1.5 mm
Pixel spacing       0.7 mm
5. Results
5.1. Errors
Fiducial Registration Error (FRE). This error is calculated from the deviation of the transformed fiducial points from the measured points. It describes the accuracy of the overall rigid-body transformation. An error of 0.1 mm to 0.2 mm could be achieved in laboratory setups, increasing to 0.25 mm in intraoperative setups, dependent on the slice selection [4].
Target Registration Error (TRE). This error describes the deviation for a spatial point and is therefore usually more relevant for clinical use. It was evaluated for typical human head trajectories in the working volume. The target errors all remain below 0.5 mm, with a three-dimensional distribution as shown in [5].
5.2. Calculation time
Calculation time depends on the number of slices to investigate and the number of relevant regions inside the image. For example, a large amount of metal causes longer calculation times while searching for the titanium fiducial rods, even when most of the ‘blobs’ remain unused. Using the whole data volume of e.g. 200 slices increases the calculation time to half a minute, whereas 30 slices could be processed in less than five seconds.
5.3. Dependencies
The presented errors are highly dependent on the acquired pixel spacing of the image data. However, intelligent slice selection by the FiducialFinder module can produce low registration errors when combined with corresponding settings for the LineFinder module (Table 3, lower values), with an additional gain in calculation time.
Table 3. Registration errors in different setups.
FRE, Prototype II, laboratory setup        0.25 - 0.35 mm
FRE, Prototype II, intraoperative setup    0.25 - 0.45 mm
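To make the error measure of Sec. 5.1 concrete, the FRE can be computed as sketched below. The paper does not spell out its exact formula, so this uses the common root-mean-square definition as an assumption, not the authors' implementation.

```python
import numpy as np

def fiducial_registration_error(model_pts, measured_pts, R, t):
    """RMS distance between the rigidly transformed model fiducials
    and their measured counterparts (one common FRE definition)."""
    model_pts = np.asarray(model_pts, float)
    measured_pts = np.asarray(measured_pts, float)
    residuals = measured_pts - (model_pts @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```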
6. Discussion
The presented hardware and software approaches show good results, even when challenged with intraoperative imaging. The interface for commercial navigation systems is already defined and has to be evaluated under clinical conditions. Future work has to be done on:
- the slice selection scenario,
- intraoperative calculation speed,
- optimal parameter adaption for the modules in different clinical scan setups.
The intraoperative evaluations were done at the Department of Neurosurgery, University Hospital Innsbruck, Austria.
References
1. D. Troitzsch, J. Hoffmann, D. Bartz, F. Dammann, S. Reinert, Oberflächen-Laserscanner versus Marker-Registrierung für die bilddatengestützte chirurgische Navigation, BMT 48(1), 112-113 (2003).
2. R.A. Brown, A Stereotactic Head Frame for Use with CT Body Scanners, Investigative Radiology 14(1), 300-304 (1979).
3. B.K.P. Horn, Closed-form solution of absolute orientation using orthonormal matrices, Journal of the Optical Society of America A 5(7), 1127-1135 (1988).
4. J.M. Fitzpatrick, J.B. West, C.R. Maurer, Predicting Error in Rigid-Body Point-Based Registration, IEEE Transactions on Medical Imaging 17(5), 694-702 (1998).
5. C.R. Maurer, J.M. Fitzpatrick, M.Y. Wang, R.L. Galloway, R.J. Maciunas, G.S. Allen, Registration of head volume images using implantable fiducial markers, IEEE Transactions on Medical Imaging 16(4), 447-462 (1997).
ULTRA-FAST HOLOGRAPHIC RECORDING AND AUTOMATIC 3D SCAN MATCHING OF LIVING HUMAN FACES
DOMINIK GIEL, SUSANNE FREY, ANDREA THELEN, JENS BONGARTZ, PETER HERING
caesar foundation, Ludwig Erhard Allee 2, D-53175 Bonn, Germany
E-mail: {giel,frey,thelen,bongartz,hering}@caesar.de
ANDREAS NUCHTER, HARTMUT SURMANN, KAI LINGEMANN, JOACHIM HERTZBERG
Fraunhofer Institute for Autonomous Intelligent Systems (AIS), Schloss Birlinghoven, D-53754 Sankt Augustin, Germany
E-mail: {nuechter,surmann,lingemann,hertzberg}@ais.fraunhofer.de
3D models of the skin surface of patients are created by ultra-fast holography and automatic scan matching of synchronously recorded holograms. By recording with a pulsed laser and continuous-wave optical reconstruction of the holographic real image, motion artifacts are eliminated. Focal analysis of the real image yields a surface relief of the patient. To generate a complete 360° patient model, several synchronously recorded reliefs are registered by automatic scan matching. We find the transformation, consisting of a rotation and a translation, that minimizes a cost function containing the Euclidean distances between point pairs from two surface relief maps. A variant of the ICP (Iterative Closest Points) algorithm [2] is used to compute such a minimum. We propose a new fast approximation based on kD-trees for the problem of creating the closest point pairs, on which the ICP algorithm spends most of its time.
1. Introduction
To treat diseases, injuries and congenital or acquired deformities of the head and neck, maxillo-facial surgeons deal with complex surgery. For example, the correction of disfiguring facial birth defects requires the manipulation of skull bones with maximum precision. The preoperative simulation of such procedures requires a 3D computer model of the patient’s face. We describe an approach to create such a 3D patient model by ultra-fast holographic recording and automatic scan matching of synchronously captured holograms. The pulsed hologram records the patient’s portrait within a single
laser shot (pulse duration approx. 35 ns). This so-called master hologram contains the complete 3D spatial information which, due to the extremely short recording time, is not affected by involuntary patient movements. In a second step, the real image of the hologram is optically reconstructed with a cw laser. By moving a diffusor screen through the real image, a series of 2D images is projected and digitized with a CCD camera. This process is referred to as hologram tomography [3]. Each projection shows the surface contour of the patient where the image is in focus. The method was first introduced as the locus-of-focus technique [1] in the context of non-medical imaging. Besides the desired intensity from in-focus points on the object contour, each captured image also contains a blurred background of defocused parts of the real image. The main problem of locating the surface is therefore to distinguish between focused and unfocused regions in each slice. This procedure yields a relief map of the parts of the patient visible from the hologram. In order to record a complete 360° model of a patient, multiple holograms are recorded synchronously, i.e. with the same laser pulse. Subsequently, the resulting relief maps are registered (i.e. their relative orientation is calculated) by automated scan matching. The problem of automated scan matching is to find a transformation, consisting of a rotation and a translation, that minimizes a cost function containing the Euclidean distances between point pairs [6] which both represent the same surface shape. Given that the surface shapes are acquired independently by locus-of-focus analysis of two synchronously recorded holograms, such an approach yields 360° models of complex surfaces.
2. Hologram tomography
2.1. Recording and optical reconstruction
Portrait holograms are recorded with a holographic camera of the type GP25 (brand Geola) with master oscillator and second-harmonic generation. The resulting wavelength of 526.5 nm has a small penetration depth into skin to minimize light diffusion. Careful mode selection leads to a coherence length of approximately 6 m. The laser output is split into three beams: two of them serve for homogeneous illumination of the object. They are expanded by concave lenses and diffusor plates at the output ports of the laser. The third beam serves as reference beam. The hologram plate (30 cm x 40 cm, VRP-M emulsion by Slavich) is developed with SM-6 and bleached with PBU-Amidol to obtain phase holograms. A frequency-doubled cw Nd:YAG laser (COHERENT Verdi V-2) is used to reconstruct
the holographic real image. To obtain the 2D projections of the real image, a diffusor (thickness 40 µm, diameter 380 mm) is moved on a computer-controlled linear positioning stage (PI M-531.DD, max. resolution 10 µm) through the image volume. The diffusor images are digitized by a KODAK Megaplus ES 4.0 digital camera with 2048 x 2048 pixels.
2.2. Locus of focus
To analyze the sequence of 2D projections, the so-called slices, we use digital image processing. As an approximation we assume that the surface derived from a single hologram has no undercuts. Therefore no two surface points share the same (x, y)-coordinate, and the surface can be represented by a relief map. As already mentioned, each captured slice contains the specific focused information representing the object shape contour and a defocused background. The task is thus to distinguish between focused and defocused image regions. To evaluate the sharpness of an image in a conventional imaging system, several algorithms have been proposed in the field of image processing. We found that the best measure for image sharpness is the statistical variance V_{(x,y)}(z) of the light intensity on pixels adjacent to (x, y). For each lateral coordinate (x, y), the sharpness measure V_{(x,y)}(z) is a positive, real number. The axial coordinate z_{(x,y)} is assigned by choosing z_{(x,y)} to satisfy V_{(x,y)}(z_{(x,y)}) \geq V_{(x,y)}(z) \forall z. Thus each holographic real image gives a relief map of the object surface.
3. Automatic 3D Scan Matching
Since the relative orientation of multiple reliefs is usually not known a priori with the desired accuracy, these reliefs have to be merged into one coordinate system. This process, also known as scan matching since it originally referred to the orientation of scans from laser triangulation systems, is called registration. The geometric structure of overlapping 3D reliefs that correspond to a single shape has to be considered for registration. In general, scan matching approaches can be classified into two categories: (1) Matching as an optimization problem uses a cost function to evaluate the quality of the 3D scan alignment. The range images are registered by determining the rigid transformation (rotation and translation) which minimizes the cost function. (2) Feature-based matching extracts distinguishing features of the range images and uses corresponding features for calculating the alignment of the reliefs.
3.1. Matching as an Optimization Problem
The following method for the registration of point sets is part of many publications, so only a short summary is given here. The complete algorithm was first published in 1992 and can be found, e.g., in [2]. The method is called the Iterative Closest Points (ICP) algorithm. Given two independently derived sets of 3D points, M (model set, |M| = N_m) and D (data set, |D| = N_d), which correspond to a single shape, we want to find the transformation consisting of a rotation R and a translation t which minimizes the following cost function:
E(R, t) = \sum_{i=1}^{N_m} \sum_{j=1}^{N_d} w_{i,j} \, \| m_i - (R d_j + t) \|^2    (1)
The value 1 is assigned to w_{i,j} if the i-th point of set M describes the same point in space as the j-th point of set D; otherwise w_{i,j} is set to 0. Two things have to be calculated: first the corresponding points, and second the transformation (R, t) that minimizes E(R, t) using the point correspondences. The ICP algorithm calculates the point correspondences iteratively. In each iteration step it selects the closest points as correspondences and calculates the transformation (R, t) for Eq. (1). It has been shown that the iteration terminates in a (local) minimum [2]. The assumption is that in the last iteration step the point correspondences are correct. In each iteration the transformation is calculated by the quaternion-based method of Horn [5]. A unit quaternion is a 4-vector q = (q_0, q_x, q_y, q_z)^T with q_0 \geq 0 and q_0^2 + q_x^2 + q_y^2 + q_z^2 = 1. It describes a rotation axis and an angle to rotate around that axis. A 3 \times 3 rotation matrix R is calculated from the unit quaternion according to the following scheme:

R = \begin{pmatrix}
q_0^2 + q_x^2 - q_y^2 - q_z^2 & 2(q_x q_y - q_z q_0) & 2(q_x q_z + q_y q_0) \\
2(q_x q_y + q_z q_0) & q_0^2 - q_x^2 + q_y^2 - q_z^2 & 2(q_y q_z - q_x q_0) \\
2(q_x q_z - q_y q_0) & 2(q_y q_z + q_x q_0) & q_0^2 - q_x^2 - q_y^2 + q_z^2
\end{pmatrix}
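As a compact, self-contained cross-check of this scheme (our own sketch, not the authors' code), the quaternion-to-matrix conversion can be written directly:

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix from a unit quaternion q = (q0, qx, qy, qz),
    following the scheme printed above."""
    q0, qx, qy, qz = q
    return np.array([
        [q0*q0 + qx*qx - qy*qy - qz*qz, 2*(qx*qy - qz*q0),             2*(qx*qz + qy*q0)],
        [2*(qx*qy + qz*q0),             q0*q0 - qx*qx + qy*qy - qz*qz, 2*(qy*qz - qx*q0)],
        [2*(qx*qz - qy*q0),             2*(qy*qz + qx*q0),             q0*q0 - qx*qx - qy*qy + qz*qz],
    ])
```

For example, the quaternion (cos 45°, 0, 0, sin 45°) encodes a 90° rotation about the z-axis.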
To determine the transformation, the mean values (centroid vectors) c_m and c_d of the points that contribute to the matching are subtracted from all points in M and D respectively, resulting in sets M' and D'. The rotation, expressed as a quaternion, that minimizes equation (1) corresponds to the largest eigenvalue of a symmetric 4 \times 4 matrix whose entries are built from sums of the cross-covariance form S_{xx} = \sum_{i,j} w_{i,j} m'_{i,x} d'_{j,x}, S_{xy} = \sum_{i,j} w_{i,j} m'_{i,x} d'_{j,y}, and so on [5]. After the calculation of the rotation R, the translation is t = c_m - R c_d. Fig. 1 shows three steps of the ICP algorithm (see footnote a); the corresponding surface meshes are given in Fig. 2.
Figure 1. Registration of two 3D reliefs with the ICP algorithm. Left: Initial alignment. Middle: Alignment after 4 iterations. Right: Final alignment after 85 iterations.
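A minimal, self-contained sketch of the ICP loop described above is given below. It is an illustration, not the authors' implementation: it pairs points by brute-force closest-point search and, purely for brevity, solves each iteration's rigid transform with the SVD-based (Kabsch) closed form instead of the quaternion method of Horn.

```python
import numpy as np

def icp(model, data, iterations=50, tol=1e-7):
    """Minimal ICP sketch: in each iteration pair every data point with
    its closest model point (brute force) and solve the rigid transform
    mapping data onto model in closed form (SVD/Kabsch)."""
    R, t = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iterations):
        moved = data @ R.T + t
        # brute-force closest-point correspondences
        d2 = ((moved[:, None, :] - model[None, :, :]) ** 2).sum(axis=2)
        pairs = model[d2.argmin(axis=1)]
        err = d2.min(axis=1).mean()
        if prev_err - err < tol:
            break
        prev_err = err
        # closed-form rigid transform for the current pairing
        cm, cd = pairs.mean(axis=0), data.mean(axis=0)
        H = (data - cd).T @ (pairs - cm)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        t = cm - R @ cd
    return R, t
```

For well-separated points and a small initial misalignment, the loop recovers the exact transform in a few iterations.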
3.2. Time Complexity Reduction
The ICP algorithm spends most of its time in creating the point pairs. kD-trees (here k = 3) have been suggested to speed up the data access [1].
a) For an animation of this result please refer to the following website: http://www.ais.fraunhofer.de/face.
A kD-tree is a binary tree with terminal buckets. The data is stored in the buckets; the keys are selected such that the data space is divided into two equal parts. This ensures that a data point can be selected in O(log n) time on average. Recently, Greenspan and Yurick have introduced approximate kd-trees (Apx-kd-tree) [4]. The idea is to return as approximate nearest neighbor p_a the closest point p_b in the bucket region where the given point p lies. This value is determined by the depth-first search, so expensive Ball-Within-Bounds tests and backtracking are not necessary [4]. In addition to these ideas we avoid the linear search within the bucket: during the construction of the Apx-kd-tree the mean values of the points within a bucket are computed and stored. The mean value of the bucket is then used as the approximate nearest neighbor, replacing the linear search. Table 1 summarizes the results.
Table 1. Computing time and number of ICP iterations to align two 3D reliefs.
point pairing method    time          # ICP iterations
brute force search      4 h 25 min    87
kD-tree                 14.2 sec      87
Apx-kD-tree             10.9 sec      85
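The bucket-mean shortcut can be illustrated with a toy tree. This is a simplified sketch, not the authors' Apx-kd-tree: a query simply descends to its bucket and returns the stored bucket mean, with no backtracking and no in-bucket linear search.

```python
import numpy as np

class ApxKDTree:
    """Toy approximate kD-tree in the spirit described above."""
    def __init__(self, pts, depth=0, bucket=8):
        pts = np.asarray(pts, float)
        if len(pts) <= bucket:
            # leaf: store only the bucket mean, used as the answer
            self.leaf, self.mean = True, pts.mean(axis=0)
            return
        self.leaf = False
        self.axis = depth % pts.shape[1]     # cycle through x, y, z
        pts = pts[pts[:, self.axis].argsort()]
        mid = len(pts) // 2
        self.split = pts[mid, self.axis]     # median split value
        self.left = ApxKDTree(pts[:mid], depth + 1, bucket)
        self.right = ApxKDTree(pts[mid:], depth + 1, bucket)

    def query(self, p):
        """Depth-first descent to the query's bucket; no backtracking."""
        if self.leaf:
            return self.mean
        child = self.left if p[self.axis] < self.split else self.right
        return child.query(p)
```

For clustered data the bucket mean is a usable stand-in for the true nearest neighbor, which is exactly the trade-off the timing table reflects.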
3.3. Matching Multiple 3D Reliefs
To digitize human faces without occlusions, multiple depth maps have to be registered. After registration the scene has to be globally consistent. A straightforward method for aligning several 3D reliefs is pairwise matching, i.e., the new scan is registered against the scan with the largest overlapping area; the latter is determined in a preprocessing step. An alternative method is incremental matching, i.e., the new 3D relief is registered against a so-called metascan, which is the union of the previously acquired and registered reliefs. Each scan matching has a limited precision. Both methods accumulate the registration errors such that the registration of many reliefs leads to inconsistent scenes [6]. Pulli presents a registration method that minimizes the global error and avoids inconsistent scenes [7]. Based on the idea of Pulli we designed a method called simultaneous matching. Hereby the first scan is the masterscan and determines the coordinate system. This scan is fixed. The following steps register all reliefs and minimize the global error:
(1) Based on the prior knowledge about the relative coordinate systems, which need not be precise or complete, pairwise matching is used to find a start registration for a new scan. This step speeds up computation.
(2) A queue is initialized with the new scan.
(3) Three steps are repeated until the queue is empty:
(a) The current scan is the first scan of the queue; it is removed from the queue.
(b) If the current scan is not the master scan, a set of neighbors (all reliefs that overlap with the current scan) is calculated. This set of neighbors forms the model point set M. The current scan forms the data point set D and is aligned with the ICP algorithm.
(c) If the current scan changes its location by applying the transformation, then each scan of the set of neighbors that is not yet in the queue is added to the end of the queue.
4. Results and Conclusions
We have demonstrated that the techniques of hologram tomography and automated scan matching can be combined to create 360° models of living human heads. Due to the ultra-fast acquisition, these models are inherently free of motion artifacts, as opposed to surface models recorded with laser triangulation. The ICP algorithm is thus a valuable tool to register high-resolution models of the living human skin surface, which today are commonly recorded by laser scanning and limited by the low acquisition speed in conjunction with the motion of breathing, heartbeat and involuntary movements of the patient. Even with relief models from a single hologram, holographic recordings are of value for the documentation and prediction of complex maxillofacial surgery. We expect that the possibility to assemble multiple holographic reconstructions into accurate 3D patient models will greatly increase the acceptance of the surface models obtained by hologram tomography.
Acknowledgment The authors wish to acknowledge the cooperation with the medical groups of Prof. Dr. Dr. H.F. Zeilhofer (Kantonsspital Basel, Switzerland) and Prof. Dr. Dr. C.U. Fritzemeier (Universitatsklinikum Dusseldorf, Germany).
Figure 2. Surface meshes. Left: 3D relief as surface mesh. Right: Registered second mesh (blue) merged with the first one.
References
1. J. L. Bentley. Multidimensional binary search trees used for associative searching. Communications of the ACM, 18(9):509-517, September 1975.
2. P. Besl and N. McKay. A Method for Registration of 3-D Shapes. IEEE Transactions on Pattern Analysis and Machine Intelligence, 14(2):239-256, February 1992.
3. D. M. Giel. Hologram tomography for surface topometry. PhD thesis, Mathematisch-Naturwissenschaftliche Fakultät der Heinrich-Heine-Universität Düsseldorf, 2003.
4. M. Greenspan and M. Yurick. Approximate K-D Tree Search for Efficient ICP. In Proceedings of the 4th IEEE International Conference on Recent Advances in 3D Digital Imaging and Modeling (3DIM '03), pages 442-448, Banff, Canada, October 2003.
5. B. Horn. Closed-form solution of absolute orientation using unit quaternions. Journal of the Optical Society of America A, 4(4):629-642, April 1987.
6. A. Nüchter, H. Surmann, K. Lingemann, and J. Hertzberg. Consistent 3D Model Construction with Autonomous Mobile Robots. In Proceedings of the KI 2003: Advances in Artificial Intelligence, 26th Annual German Conference on AI, Springer LNAI vol. 2821, pages 550-564, Hamburg, Germany, September 2003.
7. K. Pulli. Multiview Registration for Large Data Sets. In Proceedings of the 2nd International Conference on 3D Digital Imaging and Modeling (3DIM '99), pages 160-168, Ottawa, Canada, October 1999.
8. K. A. Stetson. Holographic surface contouring by limited depth of focus. Applied Optics, 7(5):987-989, 1968.
AUTOMATIC COARSE REGISTRATION OF 3D SURFACE DATA IN ORAL AND MAXILLOFACIAL SURGERY
TOBIAS MAIER
Chair for Optics, Institute for Optics, Information and Photonics, University of Erlangen-Nuremberg, Staudtstr. 7/B2, 91058 Erlangen, Germany
[email protected]
M. BENZ†, N. SCHÖN†, E. NKENKE††, F. W. NEUKAM††, F. VOGT††† AND G. HÄUSLER†
† Chair for Optics, Institute for Optics, Information and Photonics, University of Erlangen-Nuremberg, Staudtstr. 7/B2, 91058 Erlangen, Germany
†† Department of Oral and Maxillofacial Surgery, University Erlangen-Nuremberg, Glückstr. 11, 91054 Erlangen, Germany
††† Chair for Pattern Recognition, University Erlangen-Nuremberg, Martensstr. 3, 91058 Erlangen, Germany
Our aim is to support the surgeon during the adjustment of a displaced eye ball after a fracture of the zygomatic bone. Therefore, the facial surface is repeatedly measured intraoperatively by an optical range sensor. These data have to be registered with preoperatively modeled target data in order to perform a reproducible nominal/actual data comparison based on well-defined criteria. Since the spatial orientation and position of the sensor relative to the patient's face cannot be presumed, the registration process via the ICP algorithm requires a prior coarse registration. This paper presents a problem-oriented method for the fast automatic coarse registration of facial surfaces. The method exploits some characteristics of the Gaussian image that are invariant against the facial form changes caused by the surgical operation.
1. Introduction
The zygomatic fracture associated with a dislocation of the eye ball is one of the most frequent traumata to the facial skeleton. The success of adjusting the correct globe position depends on the experience of the surgeon. In the last two years we have developed a system based on an optical range sensor that supports the surgeon during the operation by comparing intraoperatively gained actual 3D data with preoperatively computed 3D nominal data of the face (as triangle meshes) and the eye globe position. During the operation the surgeon can easily acquire actual 3D data at any time with low time cost and without any radiation exposure. In order to give practical feedback to the surgeon, some clinical characteristics are calculated indicating the relative globe position [1,2]. The data have to be rigidly registered for comparison. The registration process is done in two steps: coarse registration and fine registration. The fine registration is done by default via the Iterative Closest Point algorithm (ICP) [3], whereas the necessary coarse registration is often accomplished by user interaction [4]. Despite several approaches, the reliable real-time automation of geometry-based coarse registration still remains a challenge. Moreover, we need an algorithm that deals with data that are subject to form changes by the operation.
2. State of the art
One can find several methods for the geometry-based coarse registration of three-dimensional free-form surfaces in the literature. The majority of the approaches are feature-based and favor the calculation of local differential-geometric [5] or statistical point features [6]. As a global method, [7], which is based on a Hough table, might be mentioned. As a general rule these algorithms offer a high degree of universality but are time-consuming in implementation and runtime. Our approach is based on the Gaussian image of an object (see Methods). The application of Gaussian images can primarily be found in the field of object or form recognition, for example [8]. It is often used in an extended and more complex form, for example for symmetry detection [9]. The application for matching has so far been limited to static objects (surfaces constant in time) that are closed or convex, as in [10]. The application for the registration of three-dimensional facial surface data (non-convex and non-closed data) was introduced for the first time by [11]. The paper in hand describes an improvement of this algorithm.
3. Methods
The hardware of the system is an optical range sensor according to [12] that performs fast, highly accurate and non-contact 3D surface data acquisition (Fig. 1). One measurement takes 640 ms and has an uncertainty of 200 µm. Two cameras are used simultaneously in order to gain a wider field of view. The sensor is connected to a commercial PC and controlled by the software 3D-Cam. The sensor calibration is accomplished according to [13]. Sensor hardware and software are produced by 3d-shape*. The measurements generate raster range data that are converted to triangle meshes after some data preprocessing. The data are smoothed by the normals
* http://www.3d-shape.com
filter according to [14]. The preprocessed data sets of faces consist of 20,000 to 30,000 vertices.
Figure 1. Left: Optical 3D sensor (phase-measuring triangulation), built up from a light source with a projector (active illumination) and two cameras, mounted on an adjustable and movable tripod. Right: A sinusoidal stripe pattern is projected onto the patient’s face and measured by the cameras.
The nominal 3D data of the patient’s face and eye position are computed preoperatively from the first data acquisition [15]. Fine registration is performed by an optimized ICP variant according to the fundamental results of [16]. The preceding coarse registration is automated by our approach, which adopts the Gaussian image of a face. The Gaussian image of an object is built up by its surface normals, which are calculated anyway for the visualization of the 3D data. A Gaussian image looks like a dotted unit-sphere surface. A short summary of the method in [11] is given below: The algorithm discretises the Gaussian image by an overlaying cubic lattice. Only those cells of the cubic 3D lattice that intersect the sphere surface and contain normals are used for all further calculations (Fig. 2a). A single feature, the so-called cell density, is calculated for each cell by counting the enclosed normals and normalizing this number by the area of intersection. The area of intersection is quickly approximated by the inscribed triangle planes that result from the intersection points between the sphere surface and the cell borders, as outlined in Figure 2b. The idea is to search for a compact region in the Gaussian image where many normals are close together. Every normal is assigned to a single vertex of the 3D surface data. Because the human face is roughly convex, such a dense region should nearly represent a contiguous shallow region of the face, such as a cheek. Searching for the densest cell indeed leads to a corresponding subset of vertices of the facial data that basically represents a part of the patient’s cheek (Fig. 3).
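The densest-cell search can be sketched in a few lines. This is our illustrative simplification, not the authors' implementation: it bins raw normal counts only, deliberately omitting the paper's normalization by the sphere/cell intersection area and the pooling over neighboring cells introduced later.

```python
import numpy as np

def densest_cell_vertices(normals, cells=30):
    """Bin unit normals into a cells^3 lattice over [-1, 1]^3 and
    return the indices of the vertices whose normals fall into the
    fullest cell."""
    normals = np.asarray(normals, float)
    # map each normal to integer lattice coordinates in [0, cells-1]
    idx = np.clip(((normals + 1.0) / 2.0 * cells).astype(int), 0, cells - 1)
    flat = idx[:, 0] * cells * cells + idx[:, 1] * cells + idx[:, 2]
    counts = np.bincount(flat, minlength=cells ** 3)
    best = counts.argmax()
    # vertices share indices with their normals, as in the paper
    return np.nonzero(flat == best)[0]
```

A cluster of near-parallel normals (a shallow patch such as a cheek) dominates one lattice cell and is returned as the vertex subset.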
Figure 2. a) Gaussian image discretised by a cubic lattice. The discretising scheme here is 30³ cells. b) Principle of the fast approximation of the area of intersection between a sphere surface and a cube.
This region remains unaffected by surgery in cases of a zygomatic fracture. The subsets from different data but from the same person have similar patchy shapes. These shapes are suited for registration by principal axes transformation.
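Registration of the two vertex subsets by principal axes can be sketched as follows. This is a simplification under our own naming: it ignores the sign ambiguity of eigenvectors (each axis may flip), which a practical implementation must resolve before using the transform.

```python
import numpy as np

def principal_axes(points):
    """Centroid and covariance eigenvectors, sorted by decreasing
    eigenvalue (columns = principal axes)."""
    pts = np.asarray(points, float)
    c = pts.mean(axis=0)
    evals, evecs = np.linalg.eigh(np.cov((pts - c).T))
    order = evals.argsort()[::-1]
    return c, evecs[:, order]

def coarse_align(src, dst):
    """Transform taking the principal frame of `src` onto that of
    `dst` (axis-sign ambiguities ignored in this sketch)."""
    cs, As = principal_axes(src)
    cd, Ad = principal_axes(dst)
    R = Ad @ As.T            # rotate src axes onto dst axes
    t = cd - R @ cs          # match centroids
    return R, t
```

Up to axis signs, the recovered R maps the covariance ellipsoid of one subset onto the other, which is all a coarse registration needs before ICP refinement.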
Figure 3. Video images, Gaussian images with the simple densest-cell feature of [11], and the 3D surface data with marked corresponding subsets of vertices. The principal axes of the subsets are denoted underneath. The sphere discretization scheme was 15³ cells. Left: Healthy person. Right: The same person with 6 ml saline solution injected into the malar region.
The algorithm has been tested by applying it to model cases created with a number of healthy persons: in order to simulate form changes in the face, we injected 6 ml saline solution into the malar region and registered the 3D data without and with injection. It has to be pointed out that a principal axis transformation of the whole data set is out of the question. Though always containing the region of interest, the data generally do not show the same display detail of the face because of changing camera positions during the operation (the system is designed to be flexible). Moreover, a principal axis transformation of the whole data set, including parts that change their form, is not suited for registration.
Figure 4. The same measurements as in Figure 3, but the densest-cell feature is now given by the extended definition of cell and neighbor cells. The discretization scheme is now 30³ cells. Obviously the shapes of the subsets are very similar.
Although sufficient in many cases, there is much room for improvement. In this paper we achieve a higher robustness of the algorithm by modifying the feature search strategy: First, the cubic cell size has been chosen smaller to make the discretization finer; we changed the scheme from 20³ to 30³ lattice cells. Secondly, the search for the densest cell has been extended from single cells (Fig. 3) to the combination of one cell and its neighboring cells on the unit sphere (Fig. 4). On the one hand this procedure increases the time cost. On the other hand it allows a more uniform feature sampling, and thus the patchy shapes are more similar.
4. Results and Discussion
The results show that the computing time is still within the range of our requirements: finding the densest region in a Gaussian image by the method described above needs less than two seconds on a 1.3 GHz machine (512 MB RAM). The similarity of the subset shapes is generally higher than before, where only the single discretisation cells were considered. This improvement leads to better coarse registration results. Better coarse registration means faster convergence of the subsequent optimisation by the ICP algorithm. As a consequence of the more uniform sphere sampling, the densest region in the Gaussian image and the shape of the corresponding subset of vertices are more robust against arbitrary rotation.
Figure 5. 3D data and cutting profiles of the test person in Fig. 4. Left: Unregistered data. Middle: After automatic coarse registration. Right: After fine registration (ICP).
5. Conclusion and Future Work
This paper is a contribution to the field of 3D image registration. The described approach deals with free-form surfaces (not volume data) that are partially changing in time; in this context it might be called a 4D registration. The offered solution is problem-oriented because of its limitation to oral and maxillofacial surgery. It works in real time and is hence suited for intraoperative use. The methodical advantages of our approach compared to existing ones are the limitation of the search space to two dimensions (the surface of the unit sphere) and the simplicity of the feature definition (one-dimensional feature space). Future work will cover the coarse registration problem in the context of further applications of the system in oral and maxillofacial surgery, for example in cases of jaw-bone relocation.
Acknowledgements
This research project is funded by the DFG (Deutsche Forschungsgemeinschaft) within the subproject C4 in the framework of the SFB 603: http://sfb-603.uni-erlangen.de/HTML/sfb603-g.html
AUTOMATED MARKER DETECTION FOR PATIENT REGISTRATION IN IMAGE GUIDED NEUROSURGERY
RENE KRISHNAN, ELVIS HERMANN, ROBERT WOLFF, ANDREAS RAABE AND VOLKER SEIFERT
Department of Neurosurgery, Johann Wolfgang Goethe-University, Schleusenweg 2-16, 60528 Frankfurt/Main, Germany
Fiducial marker registration within preoperative data is often left to the surgeon, who has to identify and tag the center of all markers, which is time consuming and a potential source of error. For this reason an automated procedure was developed. In this study, we investigated the accuracy of this algorithm in detecting markers within navigation data. We used the BrainLAB planning software VVPlanning 1.3 (BrainLAB, Heimstetten, Germany) to detect a total of 591 applied fiducial markers in 100 consecutive MP-RAGE MRI data sets of patients scheduled for image guided surgery. The software requires adjusting different parameters for the size ("accuracy") and grey level ("threshold") to detect marker structures in the data set, where "threshold" describes the grey level above which the algorithm starts searching for pixel clusters representing markers. This parameter was changed stepwise on the basis of a constant "accuracy" value. The size of a marker ("accuracy") in the y-direction was changed stepwise on the basis of the threshold value at which all markers were detected correctly. Time for marker detection varied between 12 and 25 seconds. An optimum "accuracy" value was found at 1.1 mm, with 8 (1.35%) undetected markers and 7 (1.18%) additionally detected structures. The average grey level ("threshold") for correct marker detection was 248.9. For a high "threshold" the rate of missed markers was low (1.86%). Starting the algorithm at lower grey levels decreased the rate of missed markers (0.17%), but the rate of additionally detected structures rose up to 27.92%. The automatic marker detection algorithm is a robust, fast and objective instrument for reliable fiducial marker registration. Specificity and sensitivity are high when optimum settings for "threshold" and "accuracy" are used.
1. Introduction
Modern frameless stereotactic devices for localization in the treatment of subcortical lesions have revolutionized the practice of neurosurgery [1]. One of the important steps for determining the accuracy of these systems is the intraoperative registration of the patient's head position [2]. The use of fiducial markers fixed to the patient's head is widespread for this purpose, although laser surface registration techniques have become commercially available and clinically applicable [4,5,6]. The registration of the markers in the data set is often left to the surgeon, who has to identify them and tag the center of all applied markers, which is time consuming and a potential source of error [7]. For this reason the development of an automated procedure to detect the fiducial markers was desirable [8].
In this study, we have investigated the accuracy of a software algorithm for detecting markers in data sets used for image-guided surgery. The influence of the adjustable values for accuracy and threshold on the sensitivity and specificity of the automatic marker detection process was reviewed.

2. Patients and Methods
In this retrospective study an automatic marker detection algorithm was used in 100 consecutive patient data sets, from 52 male and 48 female patients (ages ranged from 1 to 82 years, mean age 49 years) with different pathologies.

2.1. Placement of fiducial markers
Six fiducial markers were attached to the skull before preoperative 3D imaging. The markers were clamped into a plastic base, which was fixed to the skull by adhesive tape discs. The position was marked with indelible ink. As shown by other investigators, the distribution of the fiducial markers is one of the key factors for optimization of registration accuracy [7,9].

2.2. Data acquisition
Three-dimensional magnetization-prepared rapid gradient echo (MP-RAGE) MR image volumes (TR 9.7 ms; TE 4 ms; flip angle 12 deg.; slab 170 mm; 170 slices; slice thickness 1 mm; FOV 256 mm; matrix 256 x 256) were acquired using the head coil in a Siemens Magnetom Vision 1.5-tesla scanner. Scanning time was about 8 minutes. Gadolinium was frequently administered.

2.3. Marker detection

Applied fiducial markers are depicted in the data set as pixel clusters of a given expansion and contrast. The markers usually appear smaller in the MRI scans due to noise at the edges and cracks, depending on the resolution and on the partial volume effect as a parameter dependent on the image set. We used the VVPlanning 1.3 software (BrainLAB, Heimstetten, Germany) for preoperative planning and data preparation for intraoperative use with the VectorVision2 neuronavigation system (BrainLAB, Heimstetten, Germany). The implemented algorithm for automatic marker detection is based on a two-step procedure with two adjustable parameters, both of which were investigated.
1. The marker segmentation threshold (gray value) at which a marker is recognized and included for further analysis is the first step. The control
parameter in the program is named "threshold". Its default value is set to a gray value of 250 for MRI scans and can be changed in the program.
2. The marker size is the second step in analyzing candidate markers. Size describes the median ideal diameter of a marker in three dimensions. The size value in the x- and z-directions is a predefined and fixed variable, written to a file in the system software, where it can only be changed. The default size (accuracy) for CT markers is set to 7.0 millimeters, for MRI it is set to 6.5 millimeters. The adjustable marker size value describes the size of a marker in the y-direction, as the size in this direction depends on the slice thickness and is therefore more inaccurate. The control parameter is named "accuracy" by the software company, but should rather be addressed as marker size or adjustable marker size to avoid confusion, as it has nothing to do with the accuracy of the method or system.

2.4. Data analysis
The values for threshold (range 255-230, six intervals of 5 units) and accuracy (range 2.1-0.1 mm, 0.2 mm intervals) were varied in all MRI data sets. The influence of the gray level (threshold) value on the sensitivity of marker registration was tested against a constant size (accuracy) value of 1.3 mm in all data sets. For accuracy the test was performed at the specific gray level at which all applied markers were registered correctly. In the 100 MRI data sets a total of 591 markers were applied and had to be recognized. The results were written to a database, noting how many markers were detected or missed and how many structures were falsely misinterpreted as markers.
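The two-step cluster search described above can be sketched as follows. This is an illustrative reconstruction on a single 2D slice with assumed function and parameter names, not BrainLAB's actual implementation; the clinical algorithm works on the full 3D volume with separate x/z and y size limits.

```python
import numpy as np
from collections import deque

def detect_markers(image, threshold=250, size_mm=6.5, spacing_mm=1.0, tol_mm=1.1):
    """Two-step marker search: threshold, then size check.

    image      : 2D gray-value array (one slice, for illustration).
    threshold  : gray value above which pixels are considered.
    size_mm    : expected marker diameter.
    spacing_mm : pixel spacing, to convert extents to millimetres.
    tol_mm     : allowed deviation of the measured diameter (the
                 parameter the vendor calls "accuracy").
    Returns a list of (row, col) centroids of accepted clusters.
    """
    mask = image >= threshold
    seen = np.zeros_like(mask, dtype=bool)
    centers = []
    for r, c in zip(*np.nonzero(mask)):
        if seen[r, c]:
            continue
        # Flood-fill one connected cluster of bright pixels.
        queue, cluster = deque([(r, c)]), []
        seen[r, c] = True
        while queue:
            y, x = queue.popleft()
            cluster.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    queue.append((ny, nx))
        ys = [p[0] for p in cluster]
        xs = [p[1] for p in cluster]
        extent = max(max(ys) - min(ys), max(xs) - min(xs)) + 1
        # Keep only clusters whose extent matches the expected marker size.
        if abs(extent * spacing_mm - size_mm) <= tol_mm:
            centers.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centers
```

Lowering `threshold` or widening `tol_mm` admits more clusters, which mirrors the trade-off reported in the results: fewer missed markers but more false positives.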
3. Results
The time needed for automatic marker detection varied between 12 and 25 seconds (mean 19.4 ± 4.07 seconds). There was no system failure during the trial. False positively detected markers were easily eliminated by browsing through the acquired points and deleting these additional structures with a mouse click. If the algorithm missed fiducial markers, there are two options to carry on. The first option is manual localization of the fiducial, which is time consuming and carries the risk of missing the center of the fiducial by tagging it outside the slice containing the middle of the marker. The second option is to restart the algorithm with changed parameters, such as a lower threshold level and/or a higher accuracy value; both increase marker detection, but also increase the number of false positively detected markers. These false points, however, can easily be eliminated as stated above.
3.1. Influence of Marker Size
The algorithm searches for pixel clusters above a certain gray level (threshold) whose dimensions are 6.5 mm in the x- and z-directions and 6.5 ± accuracy mm in the y-direction. An optimum value of 1.1 mm was found for the adjustable marker size, with 8 undetected markers (1.35%) and 7 additionally detected structures mimicking markers (1.18%) out of 591 markers. The rate of markers not detected by the algorithm was below 5% for marker size values between 0.7 and 2.1 mm and increased up to 80.71% for a marker size value of 0.1 mm. At the same time the rate of falsely detected additional markers, which was 32% for greater marker size values, decreased to 0% for small values.

3.2. Influence of Threshold

The gray level at which the algorithm started the analysis was varied in 5 steps with a constant marker size of 1.3 mm in all data sets. In 14% of the data sets the marker detection algorithm found all applied markers starting at a gray level of 255, and in 63% for a starting gray level of 250. Figure 2 summarizes these data. The mean gray level above which marker detection was correct in all 100 data sets was 248.9. The automatic detection of markers was good for higher gray levels, with a rate of 11 missed markers (1.86%). By starting the algorithm at lower gray levels the rate of missed markers decreased to 0.17%. At the same time the rate of additional structures misinterpreted as markers rose from 5.92% for high gray levels up to 27.92% for lower starting gray levels (see Figure 3).

4. Discussion
Most neuronavigation systems in use today use external fiducial markers for patient registration [10,11,7,12]. Before surgery a series of corresponding points (at least three) has to be identified in both image and physical space. The computer determines a transformation between the image and physical space. With this method an average application accuracy of 2-7 mm can be achieved [2,13,14,15]. Accuracy is important to these systems, as is the surgeon's knowledge of the level of that accuracy. An advantage of marker-based systems is that the registration error depends only on the fiducial registration error (FRE) and is thus to a large extent independent of the particular object being registered.
4.1. Interpreting accuracy values in image guided surgery
To avoid confusion when comparing accuracy values it is important to distinguish between several error metrics describing the quality of a registration, as shown by Fitzpatrick et al. [16]: 1. the goodness-of-fit error calculated by the computer after a registration procedure (FRE), and 2. the real error between the image and the patient's anatomy during surgical navigation (TRE). The former is a measure of the distance between corresponding fiducial markers after registration and transformation, i.e., the minimum value of a cost function that has been used to minimize the error between two corresponding sets of points. It is used to provide a measure of the registration quality. Thus, the error may be larger or smaller in the periphery, i.e. at the site of approach. The latter (TRE) is the distance between points not used to estimate the registration transformation parameters. This "true" accuracy is also dependent on the geometrical relationship between the region of surgical interest and the objects used for registration (i.e. the fiducial array for point registration or the surface for surface registration). Accuracy will not be uniform throughout the intracranial volume, and when the surgical field is distant from the objects of referencing, a lever-arm effect may produce a significant TRE. Clinically, the TRE within the surgical field is most important, and it is a fundamental requirement not to rely on the FRE calculated by the system but to check application accuracy before starting any kind of image guided surgery. Poor registrations caused by poor fiducial configurations may appear to be good due to a small FRE value [16]. FRE should rather not be used as a direct accuracy feedback since it does not take several factors into account:
1. the accuracy of a registration generally increases with the number of registration points used;
2. the accuracy of a registration depends on location and can thus not be represented by a single number;
3. the accuracy of a registration depends on the variance and distribution of the points used.
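The point-based rigid registration and the FRE discussed above can be written down compactly. The following least-squares (Kabsch/Procrustes) sketch is a standard formulation of what such systems compute, with illustrative names; it is not the specific vendor implementation.

```python
import numpy as np

def register_points(image_pts, physical_pts):
    """Least-squares rigid registration of corresponding point sets.

    Finds rotation R and translation t such that
    physical ~ R @ image + t, and reports the FRE: the RMS distance
    between corresponding fiducials after the transformation.
    """
    ci, cp = image_pts.mean(axis=0), physical_pts.mean(axis=0)
    H = (image_pts - ci).T @ (physical_pts - cp)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cp - R @ ci
    resid = physical_pts - (image_pts @ R.T + t)
    fre = np.sqrt((resid ** 2).sum(axis=1).mean())
    return R, t, fre
```

The TRE at a surgical target p is then the distance between R @ p + t and the target's true physical position, evaluated at points that were not used in the fit; a small FRE therefore says nothing about the TRE far from the fiducial array.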
5. Conclusion
Automated marker detection is very time effective, and due to its independence of the user's interpretation of a marker's center it is an objective method of marker registration in the navigation data. Optimization of the adjustable parameters of threshold and marker size minimizes the rate of falsely recognized or undetected markers. By eliminating the observer dependent fiducial localization
error with an automatic procedure, the quality of image guided surgery can be improved.
References
1. Apuzzo, M.L., Chen, J.C.: Stereotaxy, navigation and the temporal concatenation. Stereotact. Funct. Neurosurg. (1999) 82-88
2. Alp, M.S., Dujovny, M., Misra, M., Charbel, F.T., Ausman, J.I.: Head registration techniques for image-guided surgery. Neurol. Res. (1998) 31-37
3. Raabe, A., Krishnan, R., Wolff, R. et al.: Laser Surface Scanning for Patient Registration in Intracranial Image-guided Surgery. Neurosurgery (2002) 797-803
4. Kremser, C., Plangger, C., Bosecke, R. et al.: Image registration of MR and CT images using a frameless fiducial marker system. Magn. Reson. Imaging (1997) 579-585
5. Lewis, J.T., Galloway, R.L., Jr., Schreiner, S.: An ultrasonic approach to localization of fiducial markers for interactive, image-guided neurosurgery - Part I: Principles. IEEE Trans. Biomed. Eng. (1998) 620-630
6. Maurer, C.R., Jr., Maciunas, R.J., Fitzpatrick, J.M.: Registration of head CT images to physical space using a weighted combination of points and surfaces. IEEE Trans. Med. Imaging (1998) 753-761
7. Germano, I.M., Villalobos, H., Silvers, A., Post, K.: Clinical use of the optical digitizer for intracranial neuronavigation. Neurosurgery (1999) 261-269
8. Wang, M.Y., Maurer, C.R., Jr., Fitzpatrick, J.M., Maciunas, R.J.: An automatic technique for finding and localizing externally attached markers in CT and MR volume images of the head. IEEE Trans. Biomed. Eng. (1996) 627-637
9. West, J.B., Fitzpatrick, J.M., Toms, S.A., Maurer, C.R., Jr., Maciunas, R.J.: Fiducial point placement and the accuracy of point-based, rigid body registration. Neurosurgery (2001) 810-816
10. Barnett, G.H., Miller, D.W., Weisenberger, J.: Frameless stereotaxy with scalp-applied fiducial markers for brain biopsy procedures: experience in 218 cases. J. Neurosurg. (1999) 569-576
11. Brommeland, T., Hennig, R.: A new procedure for frameless computer navigated stereotaxy. Acta Neurochir. (Wien) (2000) 443-447
12. Roessler, K., Ungersboeck, K., Dietrich, W. et al.: Frameless stereotactic guided neurosurgery: clinical experience with an infrared based pointer device navigation system. Acta Neurochir. (Wien) (1997) 551-559
13. Gumprecht, H.K., Widenka, D.C., Lumenta, C.B.: BrainLab VectorVision Neuronavigation System: technology and clinical experiences in 131 cases. Neurosurgery (1999) 97-104
14. Helm, P.A., Eckel, T.S.: Accuracy of registration methods in frameless stereotaxis. Comput. Aided Surg. (1998) 51-56
15. Sipos, E.P., Tebo, S.A., Zinreich, S.J., Long, D.M., Brem, H.: In vivo accuracy testing and clinical experience with the ISG Viewing Wand. Neurosurgery (1996) 194-202
16. Fitzpatrick, J.M., West, J.B., Maurer, C.R., Jr.: Predicting error in rigid-body point-based registration. IEEE Trans. Med. Imaging (1998) 694-702
Advanced Navigation and Motion Tracking
CLINICAL RELEVANCE OF PREOPERATIVE CT-BASED COMPUTER AIDED 3D-PLANNING IN HEPATOBILIARY, PANCREATIC SURGERY AND LIVING DONOR LIVER TRANSPLANTATION
JENS HARMS MD
Klinik für Visceral-, Transplantations-, Thorax- und Gefässchirurgie, Universität Leipzig AöR, Liebigstrasse 20, D-04103 Leipzig, Germany
HOLGER BOURQUAIN MD, OLDHAFER MD, H-O PEITGEN PHD, J HAUSS MD, J FANGMANN MD
MeVis - Center for Medical Diagnostics and Visualization, Universitätsallee 29, D-28359 Bremen, Germany
Multiple imaging approaches are currently used for diagnosis and surgery planning of hepatobiliary tumors and living donor liver transplantations. Conventional imaging studies remain insufficient to demonstrate the individual anatomy. Refinements in CT technology with the introduction of "multidetector-row" CT scanners and the implementation of mathematical methods on computerized digital data have enabled CT-based 3D visualizations. This renders preoperative surgery planning more reliable and reproducible. Since the application in oncological liver surgery has been studied previously, our interest focussed on pancreatic and biliary tract tumors, including the preoperative work-up in living donor liver transplantation. A total of 27 patients were assessed. CT-based 3D display provided accurate preoperative visualization and computerized risk analyses for safety margins of pancreatic and biliary tract tumors. In living related liver transplantation the 3D procedure may help to recognize vascular variants and to define the splitting line. The results may have major impact on patient selection and in our opinion allow better planning of the appropriate surgical approach.
1. Introduction
Hepatobiliary and pancreatic surgery as well as liver transplantation have shown considerable developments, mainly due to improvements in surgical techniques, diagnostic imaging modalities and postoperative care. Technical advances with the use of surgical devices such as ultrasonic dissectors and argon beam coagulators have improved the accuracy of surgical procedures. Recent developments of imaging techniques with the implementation of computer technologies have enabled a new quality of visualization, consisting of 3D representation, to realize image guided and computer assisted surgery. These developments allow enhanced precision in preoperative planning and image
guided surgery using intraoperative navigation tools. This is already practiced in neuro-, maxillofacial and orthopaedic surgery. 3D visualizations of visceral organs failed in the past because of technological limitations in creating image data with minimal motion artifacts and the lack of stable computerized image processing technologies. The introduction of "multidetector-row" helical CT, however, has offered the opportunity to create digital data with minimal motion artifacts¹. The application of a variety of dedicated and robust computerized image processing steps has succeeded in visualizations which enable precise localization and exact in-depth representation of tumors, especially in surgery of liver cancers²,³. So far there is no consensus among surgeons about the indicative value for treatment planning. Points of criticism are the time lag needed for computed visualizations, the non-availability of the real-time display required in the operating room and the lack of specialized medical computer scientists. Applications of laparoscopic techniques in hepatobiliary surgery are limited so far. Improvements in the field of 3D visualization and navigation techniques may allow major liver resections via a laparoscopic approach. This development is already underway in upper gastrointestinal tract surgery. Because of the limited availability of postmortal liver donors, the challenging technique of living related donor liver transplantation (LDLT) has gained increasing application worldwide. Especially this field has reinforced the interest in computerized preoperative 3D visualizations. Adult-to-adult LDLT is technically demanding. The liver has to be split into a well preserved right lobe representing the graft and a remnant left liver lobe, without any damage to the donor. To achieve these aims an accurate anatomical and functional work-up for surgical planning is mandatory. The same holds true for planning of oncological resections in biliary and pancreatic tumors.
In this field experience with 3D visualization is limited. To analyse the clinical impact of computerized 3D visualizations in hepatobiliary-pancreatic surgery and LDLT we performed a collaborative study with the IT research institute MeVis, Bremen, Germany.

2. Material & Methods
A total of 27 patients were assessed by CT-based 3D visualization techniques. 12 patients were analysed for pancreatic tumors (head n=9, corpus n=2, tail n=1), 4 patients for biliary tract tumors and 10 patients for LDLT. In addition, 3 liver transplant recipients were analysed. CT scans were performed with a 4-slice "multidetector-row" helical scanner (Siemens Volume Zoom®, Siemens, Erlangen, Germany). For computer assisted 3D visualization and image
analysis, data were transferred to the IT Research Institute MeVis, Bremen, Germany. 3D image processing of the original CT data included segmentation of specific anatomic and pathologic structures. For relevant vascular structures centre lines were calculated. A hierarchical mathematical model representing the vascular tree was created. This allowed calculation of individual vascular territories. Computerized "surgery planning" included virtual insertion of splitting lines in LDLT and of safety margins in oncologic patients. Results were displayed either one by one or in arbitrary combinations, in both 3D and overlaid on the original CT data.

2.1. Results
Pancreatic Tumors: 3D visualization of pancreatic tumors succeeded in 11 of 12 cases. In a single case of cancer of the papilla of Vater, with the tumor mass localized within the duodenum, visualization was unsatisfactory. In 2 patients arterial variants of the common hepatic artery originating from the superior mesenteric artery could be displayed. Visualization of vascular structures in the vicinity of the tumor succeeded in all cases. Involvement of regional arteries by the tumor (gastroduodenal artery n=3, superior mesenteric artery n=1, celiac trunk n=1, lienal artery n=1) or of the venous mesenterico-portal trunk (portal vein n=1, confluens n=2, superior mesenteric vein n=1, lienal vein n=1) could be shown in 5 patients. On the basis of computerized surgery planning, resection of the tumor seemed likely in 9 cases. Operative procedures included duodenopancreatectomy in 7 and extended left-sided pancreatic resection in 2 patients. In the remaining two patients palliative surgery was carried out because of multivisceral tumor infiltration. In all patients the specific findings (tumor size, localization, vascular involvement) obtained by 3D CT visualization were confirmed intraoperatively. Because of non-resectability one patient was treated with palliative chemotherapy.
Biliary Tract Tumors: 3D CT-based visualizations demonstrated localization, extraluminal extent and involvement of adjacent vascular structures within the liver hilum and the hepatoduodenal ligament. Longitudinal ductal tumor extension could be demonstrated as well. ERC and MRC examinations revealed biliary tract tumors classified as Bismuth type IV in 1, Bismuth type I in 2, and a tumor of the distal common hepatic duct in one case. 3D CT-based visualizations were discrepant in 2 cases: ERC/MRC-classified Bismuth type I tumors appeared as Bismuth type II/IIIa in 1 and as a tumor originating from the gall bladder in 1 case.
Compared to conventional radiological methods, 3D CT-based visualizations seem to be more precise in determining the extent of tumor spread and vascular involvement. Thus 3D CT-based visualizations, offering the advantage of a single non-invasive examination of biliary
tumors, may complement conventional diagnostic tools and may improve treatment planning.
Living Donor Liver Transplantation: 3D imaging studies were performed in 10 consecutive candidates evaluated for adult-to-adult living donor liver transplantation. Appropriate 3D visualizations succeeded for the arterial anatomy in 8, the portal vein anatomy in all, the hepatic venous anatomy in 9, the hepatic vein confluens in 9, and the bile duct anatomy in all 5 patients in whom a biliary contrast agent was administered. Hepatic arteries, veins and liver segments could be displayed in a multidirectional 3D view. The course and branching points of hepatic arteries and veins and their relation to the intended splitting line (full right split and left lobe remnant liver) could be visualized in arbitrary combinations. Anatomic variants detected were: (1.) accessory left hepatic arteries in 2, a hepatomesenteric trunk in 1; (2.) trifurcation of the portal vein in 2; (3.) variant hepatic veins (HV) in 6 (intraparenchymal entry of the LHV into the MHV n=3, RHV into MHV n=1, accessory RHV n=1, two RHV n=1); (4.) normal anatomy and branching patterns of the bile ducts in the 5 patients examined. On the basis of the individual anatomic findings 9 candidates were assumed appropriate for LDLT. LDLT was not performed in 3 patients: two donors had steatosis that excluded donation, and one recipient showed tumor progress. During the study period successful LDLT was performed in 2, while the others are awaiting the procedure. In summary, our data indicate that 3D CT-based visualization facilitates the diagnostic work-up with high accuracy. Multiple examinations, especially with regard to invasive diagnostics, may be avoided in future.
Liver Transplant Recipients: 3D CT imaging studies were also conducted in 3 liver transplant recipients.
Examinations were requested for assessment of tumor progress in 1, for confirmation of portal vein thrombosis and delineation of consecutive venous collaterals in another, and in a 6-month-old infant suffering from progressive hepatic failure due to Alagille syndrome. The patient with substantial tumor progress was excluded from transplantation. The suspected portal vein thrombosis was confirmed. Impressive results were obtained in the 6-month-old infant. Alagille syndrome is associated with multiple anatomic anomalies. Besides other diagnostics, 3D CT-based visualization revealed an abnormal course of the inferior caval vein, lack of a hepatic venous confluens and an arterial hepatomesenteric trunk. In this case the results obtained were more conclusive than other imaging studies. Intraoperative display allowed image guided surgery, with the consequence that the operative risk during surgical dissection could be kept low (Fig. 1a-c and 2a,b).
Figures 1a-c: (a) Preoperative 3D CT-based visualization of the complex intraabdominal vascular variants. (b) Intraoperative view after hepatectomy confirmed the vascular courses as obtained by computerized 3D display. (c) Removed cirrhotic liver.
Figures 2a,b: (a) Preoperative CT-based computer aided 3D planning of the LDLT. The splitting line follows the falciform ligament for removal of a graft consisting of the left lateral liver segments. Areas of arterial malperfusion of the remnant liver are visualized. (b) Intraoperative view after removal of a left lateral graft (liver segments II + III).
3. Discussion
Research on computer assisted surgery has expanded rapidly, allowing application to clinical procedures on a routine basis. Two fields of engineering technology are required to realize computer assisted surgery in visceral organs: (1) a surgical simulation system to realize planning according to the condition and anatomical features of the patient, and (2) an image fusion system which is applicable to visceral organ surgery and acts as an apparatus for image guided surgery. Application of computerized segmentation techniques to digital data derived from computed tomography facilitates 3D geometrical and structural organ analyses. Visualization permits multiple viewing perspectives. Interactive definition of resection lines ensures preoperative identification of "safety margins" and "areas of risk" in oncological patients and in living related liver donors. The results achieved are accurate and robust. The data may be used as a "virtual road map" during surgery. Because precise intraoperative registration and image fusion are not yet possible during visceral surgery, computer assisted surgery in the strict sense presently remains out of reach. Physiologic organ shifting and the soft composition of the organs, with deformation during surgical manipulation, are obstacles. The actual manual operating process is
guided by the fusion of visual and tactile information. For image assistance and representation during surgery, developments in the design of modern operating rooms with the operational availability of computer technologies offer an interim solution. Consequently, computerized 3D visualizations can be transferred to and displayed on flat screen monitors near the surgical field.
Pancreatic Tumors: Imaging of pancreatic tumors shows considerable progress. About 35% of pancreatic tumors are demonstrated to be resectable with curative intent⁴. Assessment of resectability in most cases remains unclear until surgical exploration. Conventional diagnostics often fail to provide accurate assessment of regional tumor infiltration. A further limitation is the inability to depict small hepatic and peritoneal metastases. Neither conventional CT techniques, nor MRI and angiography can rule out these problems. From the surgical point of view 3 issues have to be addressed clearly by the preoperative staging of pancreatic tumors: (1.) local resectability; (2.) lymph node metastases; and (3.) distant metastases. Accuracy for these determinants can be established by CT and MRI in 71% and 70%, respectively⁵. Computerized CT-based 3D visualization techniques may allow improved precision in staging pancreatic tumors, including: (1.) interactive visualization of the pancreatic tumor; (2.) tumor size; (3.) display of the vascular anatomy, i.e. variants which are essential for surgical dissection and lymphadenectomy; (4.) tumor involvement of vessels; (5.) preoperative computerized resection planning in order to ensure adequate "safety margins". The potential of CT-based 3D visualizations to detect metastatic lymph nodes remains to be shown. In summary, our preliminary experience suggests that 3D modeling of CT data should be included in the staging of pancreatic tumors in selected cases. Small peritoneal and hepatic metastases cannot be visualized sufficiently.
In such suspected cases laparoscopy is useful and favorable compared to diagnostic laparotomy.
Biliary Tract Tumors: Evaluation of biliary tract tumors is essentially an assessment of resectability, since resection is the only effective therapy. Resectability ranges between 10-50%, with 5-year survival rates of 20%. Preoperative staging must assess four critical issues: (1) extent of tumor within the biliary tree; (2) vascular invasion; (3) hepatic lobar atrophy; and (4) metastatic disease6. Cholangiography (ERC) demonstrates the location of the tumor and the ductal extent of the disease. The procedure, however, carries a considerable risk. In addition, MRC provides information regarding the patency of hilar vascular structures, the presence of nodal and distant metastases, and lobar atrophy. There are a few limitations, including cost, availability, operator dependence, patient tolerance and representation. In biliary tract tumors, experience with CT-based 3D visualizations is rare. CT-based 3D visualizations meet the above-mentioned
requirements. In our opinion, the quality and accuracy of the accomplished investigations was superior to MRC. CT-based 3D visualizations provide improved tumor localization and a virtual view of the vascular structures within the liver hilum and the hepatoduodenal ligament. 3D mapping probably allows meticulous tumor dissection and effective protection of crucial vascular structures. In hilar bile duct tumors, preoperative visualization of the course, the caliber and the branching of the bile ducts will facilitate intraoperative identification for biliodigestive reconstruction. Application of the procedure is limited due to possible side effects of the contrast medium7. The procedure, however, may be indicated in cases of non-obstructed bile ducts (living donor liver candidates) or in patients with contraindications to PTC and MRC.
Living Donor Liver Transplantation: CT-based 3D visualizations have achieved increasing acceptance in the work-up for living donor liver transplantation. The complex anatomy of the liver, with its high incidence of vascular variants, reinforces the necessity of accurate preoperative vascular imaging. Up to one-third of potential donors may not be eligible for the procedure because of unsuitable vascular anatomy8. CT-based 3D visualization is non-invasive. Our data suggest that the method has achieved a robust standard. The procedure gives essential and detailed information about variants of the hepatic artery, the origin of the artery to segment IV, the anatomy of the portal and hepatic veins, and variants of the biliary tract. In addition, the liver volume can be calculated. In our opinion, the results of 3D visualization of CT-based cholangiograms were as good as those obtained by ERC and seem to be superior to MRC scans. According to our experience, preoperative interactive simulation of the splitting line in the donor liver is of major value because it identifies "areas at risk".
These are margins along the splitting line with potential arterial devascularization or venous congestion. In conclusion, CT-based 3D visualization seems to be a valuable tool for performing this surgical procedure with high accuracy and for minimizing potential risks to the donor and the graft.
References
1. I.R. Kamel, J.B. Kruskal, E.A. Pomfret, M.T. Keogan, G. Warmbrand and V. Raptopoulos, Impact of multidetector CT on donor selection and surgical planning before living adult right lobe liver transplantation. AJR 176:193 (2001)
2. H. Bourquain, A. Schenk, F. Link, B. Preim, G. Prause and H.-O. Peitgen, HepaVision2 - a software assistant for preoperative planning in living-related liver transplantation and oncologic liver surgery. In CARS 2002: Computer-Assisted Radiology and Surgery. Proc. of the 16th International Congress and Exhibition, Paris, June 26-29, 2002, Lemke HW (ed), Leiden University Medical Center, vol 1: 341 (2002)
3. B. Preim, H. Bourquain, D. Selle, H.-O. Peitgen and K.J. Oldhafer, Resection proposals for oncologic liver surgery based on vascular territories. In CARS 2002: Computer-Assisted Radiology and Surgery. Proc. of the 16th International Congress and Exhibition, Paris, June 26-29, 2002, Lemke HW (ed), Leiden University Medical Center, vol 1: 353 (2002)
4. J. Baulieux and J.R. Delpero, Surgical treatment of pancreatic cancer: curative resections. Ann Chir. 125:609 (2000)
5. M. Schwarz, S. Pauls, R. Sokiranski, H.J. Brambs, B. Glasbrenner, G. Adler, C.G. Dierichs, S.N. Reske, P. Moller and H.G. Beger, Is a preoperative multidiagnostic approach to predict surgical resectability of periampullary tumors still effective? Am J Surg. 182:243 (2001)
6. L.H. Blumgart and Y. Fong, Biliary tumors. In LH Blumgart, Y Fong (eds), Surgery of the Liver and Biliary Tract, W.B. Saunders Co Ltd, GB, vol 1:1021
7. D.J. Ott and D.W. Gelfand, Complications of gastrointestinal radiologic procedures II. Complications related to biliary tract studies. Gastrointest Radiol 6:47 (1981)
8. T.C. Winter, P.C. Freeny and H.V. Nghiem, Hepatic arterial anatomy in transplantation candidates: evaluation with three-dimensional CT arteriography. Radiology 195:363 (1995)
ANALYSIS OF DRILL SOUND IN SPINE SURGERY*
I. BOESNACH, M. HAHN, J. MOLDENHAUER, TH. BETH
Institut für Algorithmen und Kognitive Systeme, Universität Karlsruhe (TH), Am Fasanengarten 5, 76128 Karlsruhe, Germany
U. SPETZGER
Neurochirurgische Klinik, Klinikum Karlsruhe, Moltkestraße 90, 76131 Karlsruhe, Germany
Latest computer systems for surgical planning and navigation try to assist the surgeon in the operation theatre. The data used for this purpose stem solely from medical imaging. However, important additional information, such as the drill sound characterizing bone densities, should be regarded too. We therefore show the integration of suitable microphones in the operation theatre, the acquisition of sound data, and methods to process these data. In particular, we present methods to analyze the sound data based on neural networks, support vector machines, and hidden Markov models. First analysis results prove the capability of the new methods.
1. Introduction
One challenging task in spine surgery is to drill into a vertebra with high accuracy, e.g. to place pedicle screws. In particular, the essential vertebral arteries and the spinal cord must not be touched. During the drilling process, blood, surgical instruments, and the small access to the spine worsen the surgeon's view. To maintain accuracy, however, the surgeon needs a good 3D mental picture based on anatomical knowledge from pre- and intraoperative image data. Additionally, the surgeon gets haptic and acoustic feedback from the drilling device during the operation. In particular, the sound generated by the drill provides significant information about the tissue. Transitions between areas of different bone densities are highly correlated with changes in the drill sound. This information is independent of the current navigation data. Thus, it is a powerful add-on which provides information whenever the navigation system becomes imprecise or even fails, e.g. when problems arise from bad initial calibration or shifts of vertebrae in situ.
* This work is supported by the Deutsche Forschungsgemeinschaft (DFG) within the project "Computergestützte Planung und Navigation neurochirurgischer Eingriffe an der Wirbelsäule".
2. Integration in the operation theatre
To acquire proper data during surgery it is important to minimize interference with the surgeon. Thus, we mounted a room microphone on the lamp and fixed a boundary layer microphone to an Allis forceps, which was clamped directly to the vertebra (see Fig. 1). The condenser microphone CCM 2 by Schoeps and the C 562 BL by AKG met our requirements regarding frequency response and signal-to-noise ratio. Both were connected to a Duo USB interface by M-Audio that records sound data at a 96 kHz sampling rate and 24 bit resolution. With this configuration we acquired data sets from 4 drillings in 8 vertebrae of a human spine preparation.
Figure 1. Left: room and vertebra microphone. Right: fixation of the vertebra microphone.
3. Generation of Audio Features
Since high frequencies are required for a reliable sound classification, the audio data is recorded from the two microphones with the specified parameters. Then the audio spectrogram is computed by a short-time Fourier transform with a Blackman window of 1024 samples (10.6 ms). From the frequency data we generated a vector of 13 features for each sample window: volume, median frequency, bandwidth, and the energies of 10 subbands over the entire spectrum. Additionally, we generated 5 features directly from the audio signal: zero crossing rate, pitch, and 3 cepstral coefficients for each window of 1024 samples. The pitch is calculated by autocorrelation [1] and the cepstral coefficients by the Matlab toolbox Voicebox [2]. We analyzed the audio data and manually divided it into 9 classes (see Tab. 1). To get training data for the neural networks and support vector machines, the feature vectors of all acquired data sets were computed with a window shift of 2048 samples and normalized to the interval [0,1] by linear mapping.
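The feature generation described above can be sketched as follows. This is an illustrative Python/NumPy reimplementation, not the authors' Matlab code: it computes volume, median frequency, bandwidth, the 10 subband energies and the zero crossing rate (pitch and cepstra are omitted), and the exact feature definitions, e.g. the median frequency via the cumulative power spectrum, are assumptions.

```python
import numpy as np

def frame_features(signal, fs=96_000, win=1024, hop=2048):
    """Per-window audio features in the spirit of Sec. 3 (a sketch)."""
    w = np.blackman(win)
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    feats = []
    for start in range(0, len(signal) - win + 1, hop):
        x = signal[start:start + win]
        power = np.abs(np.fft.rfft(x * w)) ** 2
        total = power.sum() + 1e-12
        # volume (RMS), median frequency, bandwidth around the spectral centroid
        volume = np.sqrt(np.mean(x ** 2))
        cum = np.cumsum(power) / total
        median_f = freqs[np.searchsorted(cum, 0.5)]
        centroid = (freqs * power).sum() / total
        bandwidth = np.sqrt(((freqs - centroid) ** 2 * power).sum() / total)
        # energies of 10 subbands spanning the entire spectrum
        bands = [p.sum() for p in np.array_split(power, 10)]
        # zero crossing rate taken directly from the time signal
        zcr = np.mean(np.abs(np.diff(np.signbit(x).astype(int))))
        feats.append([volume, median_f, bandwidth, zcr] + bands)
    f = np.asarray(feats)
    # linear mapping of each feature to the interval [0, 1]
    span = f.max(axis=0) - f.min(axis=0)
    return (f - f.min(axis=0)) / np.where(span == 0, 1, span)
```

With the paper's parameters (window 1024, shift 2048 at 96 kHz), a one-second signal yields 47 feature vectors.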
Table 1: Types of drill sound, manually classified.

  class   description
  1       unknown
  2       idle drilling
  3       pause
  4       drilling in bone: outer vertebra
  5       drilling in bone: inner vertebra
  6       drilling in bone: scraping
  7 (V)   drilling in bone: strong scraping
  8 (P)   perforation
  9       vertebra microphone touched
  Tr      transition from class V to class P
For testing purposes, sequences of feature vectors were generated with the same normalization and window shift but starting with sample 1025, so that there is no overlap between training and testing data. For a second test, feature vectors of a selected drilling were identically normalized but shifted by 1 sample. Figure 2 shows the manual classification of this data set.
Figure 2. Manual classification of the selected data set (target output for training).
4. Methods for Sound Classification
4.1. Neural Networks

Neural networks (NN) are a well-known technique in various fields of application, including pattern recognition, identification and classification [3]. They consist of simple nodes and connections between these nodes. Each node calculates an output from its weighted inputs and a bias. Given a particular input for the network, the weights and biases can be adjusted in order to adapt the network's output to a given target output by using the backpropagation algorithm [4]. Properly trained networks tend to give reasonable answers for inputs that they have never seen. For the classification of different tissues from the drill sounds, we performed several tests with multilayer feedforward networks. The 36 audio features described in Section 3 were used as input data, and the target output was represented by a 1-out-of-9 vector according to the manually classified states of the drill sound. All network elements in this work use the log-sigmoid transfer function. In all
cases the training of the network was performed with the backpropagation algorithm in batch mode. In this work, we designed two NNs with two layers each. The first one was created with 20 and 9 neurons and trained 20 times with 10 cycles each. The second one consists of 50 and 9 neurons and was trained 10 times with 10 cycles per batch. For this purpose, the features were supersampled with offset 0 and step size 64. Since some experiments with three-layered networks did not show a better performance, we consider two layers as sufficient for the classification of drill sounds.

Table 2. Recognition rates P(i, j) = P(output = j | target = i) of the 50-9 neural network for all data sets and classes i, j ∈ {1..9}.
After the training, the networks were tested using different testing data (see Sect. 3). Table 2 shows the probabilities P(i, j) for a feature vector labeled as target class i and the network calculating the output to be class j. The overall hit rate of the 50-9 network is 77.9 %, compared to 74.8 % for the 20-9 network. Note that class 3 ('pause') and class 5 ('drilling in inner vertebra') are very hard to separate with our model. However, this is not critical for our application. The classification of the selected data set from Section 3 yields 72.07 % (see Fig. 3).
Figure 3. Neural network output for the selected data set with 72.07 % correct classification.
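A minimal batch-backpropagation network of the kind described above can be sketched as follows. This is a Python/NumPy illustration with log-sigmoid units and a squared-error loss; the default sizes, learning rate, and initialization are assumptions, not the parameters of the original experiments.

```python
import numpy as np

def sigmoid(z):
    # log-sigmoid transfer function used for all network elements
    return 1.0 / (1.0 + np.exp(-z))

class TwoLayerNet:
    """Two-layer feedforward net trained by backpropagation in batch mode."""

    def __init__(self, n_in=36, n_hidden=20, n_out=9, lr=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)
        self.y = sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def train_batch(self, X, T):
        """One batch update; T is a 1-out-of-n target matrix."""
        y = self.forward(X)
        d2 = (y - T) * y * (1 - y)                  # output-layer delta
        d1 = (d2 @ self.W2.T) * self.h * (1 - self.h)  # hidden-layer delta
        self.W2 -= self.lr * self.h.T @ d2 / len(X)
        self.b2 -= self.lr * d2.mean(axis=0)
        self.W1 -= self.lr * X.T @ d1 / len(X)
        self.b1 -= self.lr * d1.mean(axis=0)
```

The predicted class of a feature vector is the argmax over the 9 output units.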
4.2. Support Vector Machines
Another classification method is the support vector machine (SVM), which separates data x_i by an optimal hyperplane w^T φ(x) + b = 0 into the two classes w^T φ(x_i) + b < 0 and w^T φ(x_i) + b > 0. The function φ maps the original data x_i into a higher-dimensional feature space. For the mapping, we choose the RBF kernel K(x_i, x_j) = φ(x_i)^T φ(x_j) = exp(-γ ||x_i - x_j||²) with γ > 0. Details about SVMs can be found in [5]. The implementation is based on the toolbox LIBSVM [6], which allows a multi-class classification by one-against-one. With the grid-search method [6] we determined γ = 1.95 and a penalty constant C = 8150 for training a good SVM. We trained one SVM considering all 36 features and, for comparison, two SVMs considering the 18 features of each microphone. With the small SVMs we achieved classification rates of 79.5 % (vertebra) and 72.3 % (room). The 36-feature SVM, however, yields a classification rate of 85.2 % for all data (see Tab. 3). Using both microphones thus yields better results than using only one microphone. For the selected drilling data (see Fig. 2) we achieved a correct classification for 88.8 % of the samples (see Fig. 4). Although the recognition rate for the tested drilling data is higher with SVMs than with NNs, we think a hybrid combination of different classification methods is a reasonable procedure.
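In Python, the corresponding training step can be sketched with scikit-learn's SVC, which wraps LIBSVM and likewise uses one-against-one for multi-class problems. The data below are random placeholders (not the study's recordings), and the grid values merely bracket the paper's γ = 1.95 and C = 8150.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Placeholder data standing in for the 36-dim normalized feature
# vectors with manual labels 1..9 (random here, NOT the study's data).
rng = np.random.default_rng(0)
X = rng.random((270, 36))
y = rng.integers(1, 10, size=270)

# Coarse (C, gamma) grid search as in [6]; SVC's multi-class scheme
# is one-against-one, matching the LIBSVM setup described in the text.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [1, 100, 8150], "gamma": [0.1, 1.95, 8.0]},
    cv=3,
)
search.fit(X, y)
clf = search.best_estimator_
pred = clf.predict(X)
```

On real data one would refine the grid around the best coarse cell, as the LIBSVM guide suggests.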
Table 3. Recognition rates of the SVM for all data sets (see Table 2).
Figure 4. SVM output for the selected data set with 88.8 % correct classification.
4.3. Hidden Markov Models

Neural networks and SVMs only consider single feature vectors for analysis purposes. Unlike these classification methods, hidden Markov models (HMM) use temporal sequences of features O = o_1 o_2 ... o_T to analyze data. We use HMMs with N states, an initial state distribution π, and a transition matrix A. For the purpose of processing vector-valued features o_t, we use continuous HMMs with Gaussian mixtures with weights c_jm. The mixture parameters μ_jm and Σ_jm are expectation vectors and covariance matrices. Details about HMMs can be found in [7]. The implementation of our methods is based on the BNT Matlab toolbox [8]. To classify tissue by drill sounds we constructed several HMMs λ_1, ..., λ_L. Each HMM represents either a certain tissue class or the transition from one tissue to another. Given an observation sequence O, a distinction of tissues, i.e. a classification, is done by determining the model which fits the observation best. How well a model fits an observation is determined by the probability P(O|λ) from the forward algorithm. Tests have shown that frequency spectra of T time steps taken directly from the frequency analysis (see Section 3) seem to be sufficient as feature vectors o_t. However, a restriction to dedicated frequency bands might be helpful to reduce calculation costs in future work. Initially, the probabilities in π are uniformly distributed. Matrix A is a diagonally dominant stochastic matrix. To train an HMM λ for a certain tissue class, we use sequences O from corresponding ranges of the training data sets and adjust the parameters A, B, and π by expectation maximization (EM). The initial mixture parameters μ_jm and Σ_jm are determined by the expectation vectors and covariance matrices of the training data. The initial mixture weights c_jm are uniformly distributed. All mixture parameters μ_jm, Σ_jm, and c_jm are also refined during the EM steps.
Figure 5 shows the logarithmic probabilities, i.e. the classification results, of three HMMs λ_V, λ_P, λ_Tr for the selected data set (cf. Tab. 1 and Fig. 2 for the corresponding classes). Considering P(O|λ_V) and P(O|λ_P), we can evidently tell 'drilling in the vertebra' (class V) from 'perforation' (class P). Since λ_Tr is trained to recognize the transition between these two classes, the significant peak of P(O|λ_Tr) additionally characterizes the point when the drill enters the spinal canal at risk. We performed several tests to work out a proper number of states, length of observation sequences, and number of mixtures. A larger number of states N yields a better separation of the probability curves. Longer observation sequences O = o_1 o_2 ... o_T yield smoother runs of the curves. Larger numbers of
mixtures do not produce major changes in the curves. As a trade-off against higher calculation times, we used N = 18 states, T = 240 samples (equal to 2.5 ms), and M = 1 mixture for the classifications presented in this paper.
Figure 5. The distinction of the classes V and P by log(P(O|λ_V)) and log(P(O|λ_P)), and the localization of the transition from class V to P by log(P(O|λ_Tr)), in the selected drilling data set.
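The model-selection step, scoring an observation sequence under each trained HMM λ and picking the best, can be sketched as follows. This is a self-contained NumPy forward algorithm with one diagonal Gaussian per state (M = 1), an illustrative stand-in for the BNT toolbox used in the paper.

```python
import numpy as np

def log_gauss(o, mu, var):
    """Log density of a diagonal-covariance Gaussian at observation o."""
    return -0.5 * np.sum(np.log(2.0 * np.pi * var) + (o - mu) ** 2 / var)

def forward_loglik(O, pi, A, mu, var):
    """log P(O | lambda) via the forward algorithm, kept in log space."""
    N = len(pi)
    alpha = np.log(pi) + np.array([log_gauss(O[0], mu[j], var[j]) for j in range(N)])
    for o in O[1:]:
        # log-sum-exp over predecessor states keeps the recursion stable
        m = alpha.max()
        trans = m + np.log(np.exp(alpha - m) @ A)
        alpha = trans + np.array([log_gauss(o, mu[j], var[j]) for j in range(N)])
    m = alpha.max()
    return m + np.log(np.sum(np.exp(alpha - m)))

def classify(O, models):
    """Return the label of the model lambda_k that fits O best."""
    return max(models, key=lambda name: forward_loglik(O, *models[name]))
```

Given trained models {'V': λ_V, 'P': λ_P, 'Tr': λ_Tr}, `classify` implements exactly the maximum-likelihood decision described in the text.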
5. Results and Conclusions
To our knowledge, the consideration of drill sound has not been studied by other research groups yet. Therefore, we could not rely on existing results. This work describes our first steps to integrate suitable microphones in the operation theatre and to analyse the acquired sound data with different methods. The presented results show acceptable classification of sound classes by NNs, SVMs and HMMs. However, the methods have to be refined and evaluated with data from other drillings. The computational cost of the algorithms is relatively low and thus suitable for real-time processing. The combination of all three classification methods should improve the quality of the results. Further work will focus on unsupervised learning strategies for NNs. Another interesting approach will implement recurrent networks capable of recognizing temporal patterns as well as spatial patterns (e.g. Elman networks). Furthermore, the resulting recognition rates have to be compared to those of the other classification methods in this paper. To obtain a more precise classification of the training data, the position of the drill bit needs to be tracked. This can be achieved by registration of the drill position with preoperative tomographic data. The long-term objective of our work is the development of automatic real-time methods to analyse drill sounds in spine surgery. Only little effort is needed to integrate the system in the operation theatre, and the purchase costs of a final system will be low. The proposed analysis methods are an important step to augment the view of the surgeon in minimally invasive surgery. We expect that
physicians will profit from warning mechanisms based on the sound analysis. Additionally, the methods can be used for the training of unskilled surgeons.
Acknowledgements
The authors want to thank Prof. Dr. med. A. Prescher (Institut für Anatomie I, RWTH Aachen) for the spine preparation, Prof. Dr. med. P. Reimer (Zentralinstitut für Bildgebende Diagnostik, Städtisches Klinikum Karlsruhe) and his team for providing us with radiological data, Dr. med. Z. Al-Fil (Klinikum Karlsbad-Langensteinbach) for his inspiring discussions, and B. Vollmer (Schoeps Schalltechnik Karlsruhe) for his technical support.
References
1. Davy, M. and Godsill, S. J. Audio Information Retrieval: A Bibliographical Study. Technical Report, Signal Processing Group, Cambridge University, 2002.
2. Brookes, M. VOICEBOX: Speech Processing Toolbox for MATLAB. http://www.ee.ic.ac.uk/hp/staff/dmb/voicebox/voicebox.html
3. DARPA Neural Network Study, Lexington, MA: M.I.T. Lincoln Laboratory, 1988.
4. Rumelhart, D., Hinton, G. and Williams, R. Learning internal representations by error propagation. In Parallel Distributed Processing, Vol. 1, Chapter 8, The M.I.T. Press, Cambridge, MA, 1986, pp. 318-362.
5. Osuna, E. E., Freund, R. and Girosi, F. Support Vector Machines: Training and Applications. AI Memo 1602, Massachusetts Institute of Technology, 1997.
6. Chang, C.-C. and Lin, C.-J. LIBSVM - A Library for Support Vector Machines. http://www.csie.ntu.edu.tw/~cjlin/libsvm
7. Rabiner, L. R. A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proc. of the IEEE 77(2), 1989, pp. 257-286.
8. Murphy, K. Bayes Net Toolbox for Matlab. http://www.ai.mit.edu/~murphy/Software/BNT/bnt.html
EXPERIMENTAL NAVIGATION SETUP FOR CORONARY INTERVENTIONS
J. BORGERT, S. KRÜGER, R. GREWER
Philips Research Laboratories, Sector Technical Systems, Röntgenstrasse 24-26, 22315 Hamburg, Germany
E-mail: joern.[email protected]
H. TIMINGER
Department of Measurement, Control, and Microtechnology, University of Ulm, Albert-Einstein-Allee 41, 89081 Ulm, Germany

We will present an experimental setup for the development and benchmarking of algorithms and applications for coronary interventions, using non-line-of-sight motion-compensated navigation on coronary 3D roadmaps. We analyze the setup regarding the reproducibility of the included heart phantom's motion due to heart beat and respiration.
1. Introduction

In this article we will introduce an experimental setup for the development, analysis, and benchmarking of algorithms for navigation on virtual 3D roadmaps. After an introduction to the application area in Sec. 2, a description of the components of the setup is given in Sec. 3. The analysis of the reproducibility of the heart phantom's motion due to heart beat and respiration is described in Sec. 4. A brief overview of key experiments for interventional navigation is given in Sec. 5, followed by conclusions and an outlook in Sec. 6.

2. Application Area and Background
In the United States, the probability at birth of dying of Cardiovascular Disease (CVD) is 47%, surpassing cancer, which amounts to 22%. Among CVD, Coronary Heart Disease (CHD) is responsible for 54% of the deaths. Taken together, one in four persons dies from CHD (data from the American Heart Association1).
The generally accepted standard treatment for CHD is the introduction of one or more Coronary Artery Bypass Grafts (CABG) by means of open surgery. Over the last years, interventional procedures like Percutaneous Coronary Interventions (PCI) have replaced a great part of these time-consuming and risky invasive procedures. A typical, idealized PCI may consist of, but is not limited to, the following steps: The patient undergoes Diagnostic Cardiac Catheterization (DCC). In case of a stenosis, the therapy is planned on the basis of an angiogram. A Percutaneous Transluminal Coronary Angioplasty (PTCA) is performed. During or after almost all PTCA procedures, a stent is placed to reinforce the renewed free lumen. The decision about the type and size of this possible stent is based on the dimensions of the stenosis, deduced from the angiogram mentioned before. The result is checked; unsatisfactory results may lead to a reiteration of parts of the procedure. Although this is less dangerous to the patient than open surgery, it still bears disadvantages: In the case of PCI, the position of a catheter in the coronaries is determined on the basis of their greyvalue in fluoroscopic images. This leads to X-ray exposure for the patient and, more importantly, for the practitioner, who performs numerous interventions throughout his whole career, cf. Dash and Leaman2 and Miller and Castronovo3. On top of that, every visualization of the vessel structure implies the injection of contrast agent, because the natural greyvalue contrast of the vessels is not sufficient to visualize them. This puts an additional burden on the patient, because the contrast agent is toxic in high doses. Additional problems are introduced by the use of 2D images: Effects like foreshortening and overlap complicate the determination of the dimensions of a given lesion, which is important to determine the severity and to select the right device, e.g. an appropriate stent.
In addition, it is sometimes difficult to determine the correct position of the interventional devices in relation to the vessel structure, thus complicating the interventional navigation. A lot of these problems can be overcome by the introduction and use of 3D images, acquired by 3D X-ray, CT, or MR prior to the intervention. By use and extraction of the vessel structure, which is then called a roadmap, the practitioner can generate arbitrary visualizations and 3D models for the diagnostic process, e.g. the determination of lesion dimensions. He can furthermore use the roadmap for therapy, by relating the position of the interventional devices to the 3D roadmap, thus saving both contrast agent and X-ray dose, because repeated injection of contrast agent
and subsequent imaging is dispensable. For this, the position of the devices has to be determined by means of so-called non-line-of-sight localization systems, which perform spatial measurements of position and orientation without the use of an imaging system. For the correlation of the position and orientation information with the static 3D roadmap, however, the measurements have to be compensated for heart, respiratory, and patient motion, as well as for metal disturbances originating from the setup, e.g. the X-ray system itself. Details on motion compensation for interventional navigation on static 3D roadmaps, based on an affine model for respiratory motion compensation and on methods based on an analysis of the cardiac cycle and gating on the QRS complex, can be found in Timinger et al.4. Refer to Solomon et al.5 for a previous approach including animal trials without motion and metal compensation.

3. Components of the Experimental Setup
This section describes the components of the experimental setup and their interplay, as depicted in Fig. 1. It includes a standard image-intensifier-equipped X-ray system (Integris Allura, Philips Medical Systems, Best, The Netherlands), a dynamic pneumatic heart phantom, a non-line-of-sight localization system (Aurora, Northern Digital Inc., Ontario, Canada), and a main computer system to integrate the various sources of data and to visualize the application.
Figure 1. Overview of the components of the experimental setup and their interplay (left) and view of the setup in an X-ray proof box (right).
Figure 2. A typical visualization realized within the experimental setup. The virtual roadmap, together with cross sections extracted from the 3D dataset, forms an orthoviewer, which is controlled by and moves along with the catheter.
Currently, the use of the X-ray system is twofold: 3D image acquisition (rotational X-ray, reconstructed using an approach of Grass et al.6) of the contrast-agent-filled coronaries, which is used as source data for the extraction of the coronary roadmap (segmentation approach of Lorenz et al.7), and provision of real-time 2D image information. The images are overlaid with the position of the interventional devices and provide a familiar view of the application for the practitioner. The heart phantom is used to simulate the coronaries' motion due to heart beat and respiration. The motion patterns are controlled by a separate computer system; consequently, an ECG signal is the only source of information about the status of the heart beat provided to the main application. This ensures that no prior knowledge about the state of the heart phantom is passed on to the algorithms dealing with motion compensation. The non-line-of-sight (NLOS) localization system, also called a magnetic tracking system (MTS), is used to perform the position and orientation measurement of the interventional devices. The system consists of a main unit, a field generator, and trackable devices like catheters, biopsy needles, and the like. The main computer system carries out all the real-time interfacing, integration, and processing of the corresponding data, like real-time X-ray images, visualization of the virtual 3D roadmap, the integration and overlay of the measured position of the trackable catheter, registration of the individual
coordinate systems, and the processing of the volumetric data set, e.g. for slice extraction and following in accordance with the catheter position. Figure 2 shows a typical visualization realized within the experimental setup: The virtual roadmap is visualized by a triangulated structure (triangulation performed with techniques described in Lorenz et al.8). The catheter is visualized by a 3D icon, indicating the exact position of its tip and its orientation. The cross sections intersect at the position of the catheter in the coronary tree and carry an orthogonal reconstruction of the 3D dataset, thus forming an orthoviewer, which is controlled by and moves along with the catheter.
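The orthoviewer's slice extraction can be sketched as a nearest-neighbour reconstruction of an arbitrary plane through the catheter position; the function name and sampling scheme below are illustrative assumptions, not the setup's actual implementation.

```python
import numpy as np

def oblique_slice(vol, center, u, v, half=32):
    """Sample a (2*half x 2*half) plane through `center`, spanned by the
    orthonormal direction vectors u and v, from volume `vol` by
    nearest-neighbour lookup (a sketch of orthoviewer slice extraction)."""
    idx = np.arange(-half, half)
    # world coordinates of every pixel in the requested plane
    pts = (center[None, None, :]
           + idx[:, None, None] * u[None, None, :]
           + idx[None, :, None] * v[None, None, :])
    # round to the nearest voxel and clamp to the volume bounds
    ijk = np.clip(np.rint(pts).astype(int), 0, np.array(vol.shape) - 1)
    return vol[ijk[..., 0], ijk[..., 1], ijk[..., 2]]
```

In the orthoviewer, `center` would follow the tracked catheter tip and u, v its local orientation, so the extracted cross sections move along with the device.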
4. Reproducibility of the Heart Phantom's Motion
In this section, the analysis of the reproducibility of the motion of the coronaries due to heart beat and respiration is presented. Throughout the analysis it is presumed that the individual motions due to heart beat and respiration, and thus the corresponding reproducibilities, are independent. The heart rate was varied from 60 to 160 beats per minute, the respiratory cycle length from 6 to 2 seconds. Over a time of 30 seconds, about 1200 individual position measurements were taken, spanning about 30 to 80 complete cycles for heart beat and 5 to 15 complete cycles for respiration. A single heart beat and respiratory cycle is subdivided into 100 sub-phases. Using the ECG signal and a respiratory sensor input, the position measurements were associated with the corresponding sub-phases, leading to about 12 measurements per sub-phase. The resulting reproducibility is then given by the standard deviation averaged over the individual sub-phases. Each of the individual measurements was carried out ten times, leading to a mean as well as a minimal and maximal value, as shown in Table 1 and depicted in Fig. 3.
Table 1. Reproducibility of the phantom's motion (in mm) due to heart beat (left; heart rate in bpm) and respiration (right; respiratory cycle length in seconds).

         60    80    100   120   140   160  |  6     4     3     2
  min    0.39  0.50  0.61  0.64  0.72  0.71 |  0.11  0.12  0.14  0.15
  mean   0.44  0.60  0.75  0.74  0.85  0.87 |  0.15  0.16  0.17  0.20
  max    0.53  0.80  1.02  0.87  0.94  0.92 |  0.17  0.22  0.20  0.24
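The reproducibility measure described above, the standard deviation of repeated position measurements within each motion sub-phase averaged over the 100 sub-phases, can be sketched as follows; this is a NumPy illustration, and input conventions such as a phase signal in [0, 1) are assumptions.

```python
import numpy as np

def reproducibility(positions, phases, n_sub=100):
    """Mean over sub-phases of the spread (std) of the 3D positions that
    fall into each sub-phase; `phases` holds the cycle phase in [0, 1)
    derived from the ECG or respiratory sensor (a sketch of Sec. 4)."""
    bins = np.minimum((phases * n_sub).astype(int), n_sub - 1)
    stds = []
    for b in range(n_sub):
        pts = positions[bins == b]
        if len(pts) > 1:
            # repeated measurements of the same motion state should coincide;
            # their spread is the (ir)reproducibility for this sub-phase
            stds.append(np.linalg.norm(pts.std(axis=0)))
    return float(np.mean(stds))
```

Applied to about 1200 measurements per run, this yields one number per run; repeating each run ten times gives the min/mean/max entries of Table 1.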
Figure 3. Reproducibility of the heart phantom's motion due to heart beat (top; heart rate in bpm) and respiration (bottom; respiratory cycle length in seconds).
The analysis of the reproducibility of the motion due to heart beat showed a strong dependency on the heart rate. A heart rate of 60 beats per minute results in a reproducibility of about 0.44 mm, whereas a heart rate of 160 results in a reproducibility of 0.87 mm. In contrast, the reproducibility of the motion due to respiration showed only a limited dependency on the respiratory cycle length. It should be noted that above a heart rate of 140 bpm and for a respiratory cycle length of 2 seconds and below, the motion cycles became incomplete because of the large mechanical inertia of the ventricular phantom. Therefore it does not seem advisable to use heart rates above 120 bpm and respiratory cycle lengths below 3 seconds. These, however, show a reproducibility of about 0.74 mm and 0.17 mm, respectively. Both values are well below the diameter of coronaries which are subject to interventions (1 mm). According to Dodge et al.9, only the distal parts of the LAD typically have a diameter below 1 mm (data for a normal, right-dominant male). The ability to tell the position of the devices to within the diameter of the respective vessels ensures the exact correlation of the device to the roadmap.

5. Key Experiments for Interventional Navigation
This section lists the two key scientific challenges faced in interventional navigation in the cardio-vascular context. Before carrying out experiments in a clinical environment, one has to prove technical feasibility using means like the presented experimental setup, to decrease the exposure of a patient to risks.
Metal Compensation

The need for metal compensation arises in interventional navigation from the use of non-line-of-sight localization systems based on electromagnetic fields. Every occurrence of metal in the environment, especially moving parts such as the X-ray system itself, will disturb the electromagnetic fields generated by the tracking system. The presented experimental setup allows one to analyze such influences on the accuracy of the tracking. Different strategies and algorithms for compensation can be studied in great detail, while the X-ray system can be moved into any desired position.

Motion Compensation

Every position and orientation measurement in the coronaries will be superimposed by their motion due to heart beat and respiration. This combined motion has to be separated from the actual push and pull motion of the interventional device to regain the correct position and orientation that can be related to the pre-acquired roadmap. As in the former case, the experimental setup allows one to analyze different compensation strategies and algorithms, combined with different motion patterns realized by the dynamic heart phantom, in great detail.

6. Conclusions
We presented the components and interplay of an experimental setup for navigation for coronary interventions. The setup allows for the detailed study of algorithms for metal and motion compensation in the cardio-vascular context, as well as of different application approaches for interventions, such as percutaneous transluminal coronary angioplasty and stenting. The basic analysis of the reproducibility of the motion of the artificial coronary arteries due to heart beat and respiration showed a clear dependence on the heart rate, but a limited dependence on the length of the respiratory cycle. The latter in particular has to be re-evaluated once a transition into a clinical context has been made and realistic respiratory sensors, like bellows systems, are considered. Typical results for both types of motion are: 0.74 mm reproducibility at a heart rate of 120 bpm for motion due to heart beat, and 0.16 mm reproducibility at a respiratory cycle length of 4 s for motion due to respiration. Both results, and the resulting overall reproducibility, are well below the diameter of coronaries that are subject to interventional treatment.
First promising results in the area of motion compensation have been achieved and published in Timinger et al.4 The quantitative analysis is currently being extended to include metal compensation. Both turn out to be promising enough that animal experiments and the transition of the setup into a clinical context are the next steps to be considered.
7. Acknowledgments

We thank Prof. O. Dössel (University of Karlsruhe, Germany) for the collaboration on a pneumatic heart phantom, which served as a basis for the phantom used in this paper.
References
1. American Heart Association; Heart Disease and Stroke Statistics 2003 Update; http://www.americanheart.org
2. H. Dash, D. M. Leaman; Operator radiation exposure during percutaneous transluminal coronary angioplasty; JACC 4, 725-728 (1984)
3. S. W. Miller, F. P. Castronovo; Radiation exposure and protection in cardiac catheterization laboratories; Am. J. Cardiol. 55, 171-176 (1985)
4. H. Timinger, S. Krüger, J. Borgert, R. Grewer; Motion compensation for interventional navigation on 3D static roadmaps based on an affine model and gating; Phys. Med. Biol. 49, 719-732 (2004)
5. S. B. Solomon, T. Dickfeld, H. Calkins; Real-time cardiac catheter navigation on three-dimensional CT images; J. Interventional Cardiac Electrophysiology 8, 27-36 (2003)
6. M. Grass, R. Koppe, R. Proksa, M. H. Kuhn, H. Aerts, J. Op de Beck, R. Kemkers; Three-dimensional reconstruction of high-contrast objects using C-arm image intensifier projection data; Comput. Med. Imaging Graph. 23, 311-321 (1999)
7. C. Lorenz, T. Schlathölter, I. C. Carlsen, S. Renisch; Efficient segmentation of MSCT images by interactive region expansion; Computer Assisted Radiology and Surgery (CARS 2002), 407-412 (2002)
8. C. Lorenz, N. Krahnstöver; Generation of point-based 3D statistical shape models for anatomical objects; Comput. Vis. Image Underst. 77, 175-191 (2000)
9. J. T. Dodge, B. G. Brown, E. L. Bolson, H. T. Dodge; Lumen diameter of normal human coronary arteries; Circulation 86, 232-246 (1992)
BEATING HEART TRACKING IN ROBOTIC SURGERY USING 500 HZ VISUAL SERVOING
R. GINHOUX, J. A. GANGLOFF, M. F. DE MATHELIN
LSIIT UMR 7005 CNRS, Louis Pasteur University, Strasbourg I
Bd. Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Email: {ginhoux,jacques,demath}@eavr.u-strasbg.fr

L. SOLER, MARA M. ARENAS SANCHEZ AND J. MARESCAUX
IRCAD/EITS, European Institute of Telesurgery
University Hospital of Strasbourg, 1, place de l'hôpital, F-67091 Strasbourg Cedex, France
Email: [email protected]

This paper presents first in-vivo results of beating heart tracking with a surgical robot arm in off-pump cardiac surgery. The tracking is performed in a 2D visual servoing scheme using a 500 frame per second video camera. Heart motion is measured by means of active optical markers that are put onto the heart surface. The amplitude of the motion is evaluated along the two axes of the image reference frame. This is a complex and fast motion that mainly reflects the influence of both the respiratory motion and the electro-mechanical activity of the myocardium. A model predictive controller is set up to track the two degrees of freedom of the observed motion by computing velocities for two of the robot joints. The servoing scheme takes advantage of the ability of predictive control to anticipate over future references, provided they are known or can be predicted. An adaptive observer is defined along with a simple cardiac model to estimate the two components of the heart motion. The predictions are then fed into the controller references and it is shown that the tracking behavior is greatly improved.
1. Introduction
Motion compensation devices for computer-assisted surgery have recently attracted growing interest in the research community as well as in everyday clinical use. From specific mechanical systems developed to account for the patient's breathing motion in laser [1] or radio-frequency surgery [2,3], to computer algorithms for the analysis of the heart motion in thoracic surgery [4,5], motion tracking and compensation appears to be the key to more accurate and safer interaction of robots with patients. Recent works in cancer therapy [3] deal with the detection and compensation of breathing-induced motions of soft tissues. The aim is to deliver the radiation as close as possible to the tumor, thereby sparing the healthy parts. Automatic robotic systems aim at correcting for the tumor motions during the treatment [2]. A visual servoing scheme for the synchronization of a robot with the breathing motion of a liver in laparoscopic surgery was described in [6]. Beating heart surgery, particularly coronary artery bypass grafting (CABPG), is perhaps the most challenging task for robots today [7,8]. Motion of the heart limits the use of traditional approaches: it is complex and fast, and it is due to both the respiratory motion and the electro-mechanical activity of the heart. The difficulty is actually twofold. On the one hand, an appropriate sensor or measurement device is needed to precisely estimate the motion of the heart with its full dynamics. On the other hand, tracking the fast motion with a robot requires an adequate control law that takes into account the robot dynamics. A motion minimization device was shown in the first ever robot-assisted CABPG [9]. It was proposed by Computer Motion, Inc. with the Zeus robot. The system consists of acceleration sensors that are put onto the heart surface to measure the movements and control the robot. A mechanical stabilizer may also be used to reduce the motion of the area of interest, but a residual motion (about 2 mm in amplitude) still remains that could be canceled [10]. Groger and Ortmaier [4,5] have developed a robust vision algorithm that visually tracks movements of textured areas on the heart surface at 25 frames per second. The algorithm is used to predict future references for a Zeus robot, but the current limitations are the communication delay with the controller and the low frame rate, which forbid real-time tracking with the robot. Thakral et al. [11] have shown an adaptive model to predict heart displacements.
The model was identified for one degree of freedom measured with an optical fiber sensor, but no robot was considered in the application. Nakamura et al. [12] have investigated the use of a high-speed camera to precisely sample the heart motion. The motion is detected thanks to an artificial marker on the surface. It is measured as the variation of the marker's coordinates in the image reference frame at 955 frames per second. A special small-size 2-DOF robot was installed on the chest wall to track the motion. The robot was controlled using H∞ control theory to put emphasis on the high-frequency tracking requirement [13]. Results from a living pig experiment have successfully validated this high-speed visual servoing approach. In this paper the predictive control strategy is considered so that a surgical robot can track the beating heart in an open surgery scenario. The robot is controlled by visual servoing running at 500 Hz thanks to a 500 frame per second camera. Active optical markers are attached to the heart surface to allow for the measurement of the motion along the
Figure 1. View of the testbed.
Figure 2. In vivo experiment.
two directions of the image reference frame. Predictive control requires a model of the robot dynamics. It is preferred to other strategies for its ability to filter random disturbances and to anticipate over future references. A frequency-based heart motion model serves as a basis for the definition of an adaptive observer that is used to provide the controller with predicted future references. The experimental setup with the robot and video camera is described in the next section. The adaptive observer is defined in section 3. Results on a simulation setup and under in-vivo conditions are reported in section 4.

2. Robotic Setup

Figure 1 shows the testbed used for laboratory experiments and figure 2 shows the in vivo setup.
2.1. Surgical robot

The robot is a prototype of a surgical arm from the Sinters SA company in Toulouse, France. Its kinematics is similar to that of the Aesop (Computer Motion, Inc.) but with no passive joints. The robot controller is hosted by a standard PC running the Real-Time Linux environment (RTLinux). We consider the control of the shoulder (q2) and elbow (q3) joints for our experiments. The robot holds a special instrument that projects a low-power laser beam along its main axis. A small light-emitting diode (LED) is fixed at its tip as a visual marker.

2.2. Mechanism for visual measurement

Heart motions are measured visually using a high-speed camera and small-size LEDs. The camera is a DALSA CCD digital camera with 8-bit greyscale that acquires 500 frames per second (Fig. 1). It is mounted on a fixed rigid tripod. Four green LEDs are put on a small piece of tissue that covers the heart surface. The image transfer is restricted to small Areas of Interest (AoIs) around the LEDs' images inside the 256 x 256 bitmap. The transfer runs at 100 MB/s through LVDS buses connected to a 2.4 GHz Intel Xeon bi-processor computer that runs RTLinux. The high frame rate makes it possible to reduce the available visual data to the active markers only, which is an advantage in the strong lighting conditions of surgery. In the following, f0(k) will denote the image coordinates of the center of gravity of all 4 LEDs at each time step; f1(k) is the instrument's tip marker and f2(k) the image of the laser spot on the heart. Visual measurements of the heart motion are made with respect to 2 degrees of freedom as follows: x̃(k) = f2,x(k) − f0,x(k) and ỹ(k) = f2,y(k) − f0,y(k).

2.3. Control strategy
The purpose of the servoing is to make the robot instrument (laser spot f2) track the two visual measurements. A specific real-time driver was written for the frame-grabber linked to the high-speed camera. It allows one to precisely synchronize image acquisition, feature extraction and the computation of the desired joint velocities q̇2*(k) and q̇3*(k) with the camera frame rate. Therefore, control signals are sent with a 500 Hz refresh rate to the joint power amplifiers, thanks to a high-speed serial link running at 18 Mbit/s. According to [14], this is a direct, velocity-controlled, visual servoing scheme. The controller is a generalized predictive controller (GPC) that is described in [15].
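In outline, one control cycle combines the marker centroids into the two visual errors and maps them to joint velocity commands. The sketch below substitutes a plain proportional law with an assumed constant 2x2 inverse image Jacobian for the paper's generalized predictive controller; all names and the gain value are illustrative:

```python
import numpy as np

def visual_error(led_positions, laser_spot):
    """x~, y~: image-plane offset of the laser spot f2 from the
    centre of gravity f0 of the four heart-surface LEDs."""
    f0 = np.mean(led_positions, axis=0)
    return np.asarray(laser_spot, dtype=float) - f0

def joint_velocity_command(error, J_inv, gain=0.5):
    """Proportional stand-in for the GPC: drive the shoulder/elbow
    joint velocities so as to shrink the pixel error. J_inv is an
    assumed pre-identified inverse image Jacobian."""
    return -gain * (J_inv @ error)
```

At 500 Hz this loop would run once per camera frame, matching the refresh rate of the velocity commands described above.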
Figure 3. Example of the beating heart motion of a pig measured along the y image axis. The curve depicts the evolution of the ỹ(k) visual measurement over time (seconds).
3. Adaptive Observer

Figure 3 shows recordings of the visual measurements when the robot controller is not running. The signal is clearly made of two periodic components with different periods. The fast variations with medium amplitudes are due to heartbeats. The high sampling frequency of 500 Hz is necessary to sufficiently sample these fast variations so that there is no aliasing in the numerical signals. The slow variations with a large amplitude are due to the influence of the respiratory motion. Medical monitoring systems indicated Fr = 15 min⁻¹ (breathing frequency) and Fc = 123 min⁻¹ (frequency of cardiac beats), which is equivalent to a period Tr = 2000 Ts for the respiration and Tc = 243 Ts for the beating (Ts = 0.002 s is the sampling period).
3.1. Adaptive filtering
We propose to estimate the separate contributions of breathing and heartbeats in the heart motion in order to provide a GPC controller with predicted references. A simple model made up of a few sinusoids can be used to model the beating part of the heart motion:

S_c(k) ≈ Σ_{i=1}^{M} [ a_i(k) cos(2πki/T_c) + b_i(k) sin(2πki/T_c) ]

where S_c(k) is the disturbance due to cardiac beating at the k-th iteration and M is the order of the highest harmonic in the model. Let S_c^(i)(k) = a_i(k) cos(2πki/T_c) + b_i(k) sin(2πki/T_c) refer to each individual component. The a_i and b_i are the model parameters, which can be estimated online via measurements of the whole heart motion and using the gradient descent algorithm described in [16].
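A minimal sketch of such an online estimate, assuming an LMS-type gradient step (the step size mu and the class interface are illustrative; the paper's exact algorithm is the one of Bodson et al. [16]):

```python
import numpy as np

class AdaptiveHarmonicObserver:
    """Online estimate of the cardiac component
    S_c(k) = sum_{i=1..M} a_i(k) cos(2*pi*k*i/Tc) + b_i(k) sin(2*pi*k*i/Tc)
    via an LMS-type gradient step on the a_i, b_i (step size mu is an
    assumed tuning parameter)."""

    def __init__(self, M, Tc, mu=0.05):
        self.M, self.Tc, self.mu = M, Tc, mu
        self.a = np.zeros(M)
        self.b = np.zeros(M)

    def update(self, k, s_measured):
        i = np.arange(1, self.M + 1)
        c = np.cos(2 * np.pi * k * i / self.Tc)
        s = np.sin(2 * np.pi * k * i / self.Tc)
        s_hat = self.a @ c + self.b @ s   # current model output S_c(k)
        e = s_measured - s_hat            # estimation error
        self.a += self.mu * e * c         # gradient step on each a_i
        self.b += self.mu * e * s         # gradient step on each b_i
        return s_hat
```

Fed with the raw visual measurement, the estimate converges to the periodic cardiac component while the slower respiratory component remains in the residual.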
3.2. Prediction
By subtracting S_c from the total measured disturbance S, we obtain S_r, which is the disturbance due to respiration. Then S(k+j), the predicted disturbance at time step k+j (j steps in the future), is given by

S(k+j) = S_c(k − T_c + j) + S_r(k − T_r + j).

These predictions are used in combination with the GPC controller to improve the rejection of these disturbances.
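Assuming the component estimates are stored over at least one period of each, the prediction reduces to a lookup one period back (a sketch with hypothetical buffer indexing):

```python
def predict_disturbance(S_c, S_r, k, j, Tc, Tr):
    """Predicted total disturbance S(k+j): each component is assumed
    to repeat with its own period, so its future value is read one
    period back from its stored history."""
    return S_c[k - Tc + j] + S_r[k - Tr + j]
```

This is why the separation into the two components matters: each one is nearly periodic on its own, whereas their sum is not periodic over any short horizon.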
4. Results
4.1. Simulation

An experimental simulation is first built with the help of a pan-tilt head (picture in Fig. 1) and the setup described in section 2. Its two degrees of freedom are controlled by an independent computer. The signals that were measured on the beating heart (see Fig. 3) are fed into the head controller so that the same motion is reproduced mechanically. Results of tracking are shown in Fig. 4. Before t = 12 s, the GPC is using the initial references of zero. The adaptive observer is switched on from t = 12 s to compute corrected references for the controller. The peak-to-peak amplitude of the tracking errors is then clearly reduced by a factor of two.
Figure 4. Tracking of the simulated beating heart motion, with a GPC and the adaptive observer. Curves show the ỹ(k) measurement over time (seconds). The standard GPC is running before t = 12 s. The predictions are used starting from t = 12 s.
4.2. In-vivo Results
The beating heart of a living pig in open surgery is tracked using the setup of section 2. Experiments shown in this section are carried out with no
heart/lung machine and no mechanical stabilizer. The first table gives some statistics describing typical motions of the beating heart as viewed from the camera with no compensation (series C1 and C2). Statistics for the residual tracking errors in steady state are given in the second table (series C3 is the original GPC, C4 with anticipation). The mean, minimum, maximum, and variance of the error are computed with the number of samples given in the second column. The gain in variance with respect to the original motion is about 77 % for x̃ and 80 % for ỹ when using the standard GPC. Use of the adaptive observer improves the tracking behavior with respect to a standard GPC controller: the gain in variance is almost 65 % for x̃ and 21 % for ỹ. As a consequence, Model Predictive Control in conjunction with adaptive filtering of the heart motion is a promising technique to track the beating heart in real-time with a surgical robot.
Original motion (no compensation):

Series  Samples  Axis  Mean (pixel)  Min. (pixel)  Max. (pixel)  Var. (pixel²)
C1      7546     x̃     6.97          -5.88         28.58         80.52
                 ỹ     -19.24        -31.25        -5.44         42.20
C2      7183     x̃     2.62          -8.85         21.74         59.59
                 ỹ     -4.90         -19.25        4.82          45.47

Residual tracking errors in steady state:

Series            Samples  Axis  Mean (pixel)  Min. (pixel)  Max. (pixel)  Var. (pixel²)
C3 (GPC)          18707    x̃     -0.02         -10.55        12.65         17.58
                           ỹ     -0.01         -7.34         7.60          8.57
C4 (GPC+adapt.)   5601     x̃     0.09          -7.87         7.68          5.98
                           ỹ     -0.25         -10.94        7.01          6.82
References
1. L. Reinisch, M. H. Mendenhall, and R. H. Ossoff, "Precise laser incisions, corrected for patient respiration with an intelligent aiming system," Lasers in Surgery and Medicine, vol. 20, pp. 210-215, 1997.
2. A. Schweikard, G. Glosser, M. Bodduluri, M. J. Murphy, and J. R. Adler, "Robotic motion compensation for respiratory movement during radiosurgery," Journal of Computer Aided Surgery, vol. 5, no. 4, pp. 263-277, 2000.
3. K. Sharma, W. Newman, M. Weinhous, G. Glosser, and R. Macklis, "Experimental evaluation of a robotic image-directed radiation therapy system," in Proc. of the 2000 IEEE Int. Conf. on Robotics and Automation (ICRA), 2000.
4. M. Groger, T. Ortmaier, W. Sepp, and G. Hirzinger, "Tracking local motion on the beating heart," in SPIE Medical Imaging: Visualization, Image-Guided Procedures, and Display, San Diego, USA, vol. 4681, Feb. 2002.
5. T. J. Ortmaier, "Motion compensation in minimally invasive robotic surgery," Ph.D. dissertation, Technische Universität München, Germany, Mar. 2003.
6. R. Ginhoux, J. Gangloff, M. F. de Mathelin, L. Soler, J. Leroy, and J. Marescaux, "Model predictive control for tracking of repetitive organ motions during teleoperated laparoscopic interventions," in European Control Conference (ECC), Cambridge, United Kingdom, Sept. 2003.
7. F. Robicsek, "Robotic cardiac surgery: Quo vadis?" Journal of Thoracic and Cardiovascular Surgery, vol. 126, no. 3, pp. 623-624, Sept. 2003.
8. V. Falk, A. Diegeler, T. Walther, B. Vogel, N. Loscher, C. Ulmann, T. Rauch, and F. W. Mohr, "Endoscopic coronary artery bypass grafting on the beating heart using a computer enhanced telemanipulation system," Heart Surgery Forum, vol. 2, pp. 199-205, 1999.
9. H. Reichenspurner, R. Damanio, M. Mack, D. Boehm, H. Gulbins, C. Detter, B. Meiser, R. Ellgass, and B. Reichart, "Use of the voice-controlled and computer-assisted surgical system ZEUS for endoscopic coronary artery bypass grafting," Journal of Thoracic and Cardiovascular Surgery, 1999.
10. P. F. Gründeman, C. Borst, and E. W. L. Jansen, "Coronary artery bypass grafting without cardiopulmonary bypass: the Utrecht 'Octopus' tissue stabilizer," Kardiol Pol, Polish Society of Cardiology, vol. 52, pp. 43-46, 2000.
11. A. Thakral, J. Wallace, D. Tomlin, N. Seth, and N. V. Thakor, "Surgical motion adaptive robotic technology (S.M.A.R.T.): Taking the motion out of physiological motion," in Proc. of the 4th Int. Conf. on Medical Image Computing and Computer-Assisted Intervention (MICCAI), Utrecht, The Netherlands, Oct. 2001, pp. 317-325.
12. Y. Nakamura, K. Kishi, and H. Kawakami, "Heartbeat synchronization for robotic cardiac surgery," in Proc. of the 2001 Int. Conf. on Robotics & Automation (ICRA), Seoul, Korea, May 2001.
13. Y. Nakamura, H. Kawakami, and M. Okada, "Motion-cancelling robot system for minimally invasive cardiac surgery," Journal of the Robotics Society of Japan, vol. 18, no. 6, 2000, in Japanese.
14. S. Hutchinson, G. D. Hager, and P. I. Corke, "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651-670, Oct. 1996.
15. R. Ginhoux, J. Gangloff, M. de Mathelin, M. Sanchez, L. Soler, and J. Marescaux, "Beating heart tracking in robotic surgery using 500 Hz visual servoing, model predictive control and an adaptive observer," in Proc. of the Int. Conf. on Robotics and Automation (ICRA), New Orleans, USA, Apr. 2004.
16. M. Bodson, A. Sacks, and P. Khosla, "Harmonic generation in adaptive feedforward cancellation schemes," IEEE Trans. on Automatic Control, vol. 39, no. 9, pp. 1939-1944, 1994.
17. E. F. Camacho and C. Bordons, Model Predictive Control. London: Springer-Verlag, 1999.
OCCLUSION-ROBUST, LOW-LATENCY OPTICAL TRACKING USING A MODULAR SCALABLE SYSTEM ARCHITECTURE
A. KÖPFLE AND R. MÄNNER
Lehrstuhl für Informatik V, Universität Mannheim, B6 23-29 C, D-68131 Mannheim

M. SCHILL
VRmagic GmbH, B6 23-29 C, D-68131 Mannheim

M. RAUTMANN, P. P. POTT, M. L. R. SCHWARZ AND H. P. SCHARF
Orthopädische Universitätsklinik Mannheim, Theodor-Kutzer-Ufer 1-3, D-68167 Mannheim

A. WAGNER AND E. BADREDDIN
Lehrstuhl für Automation, Universität Mannheim, B6 23-29 C, D-68131 Mannheim

P. WEISER
Institut für CAE, Fachhochschule Mannheim, Windeckstr. 110, D-68163 Mannheim
This paper describes the development of an advanced optical tracking system for use in image-guided surgery (IGS). Current tracking systems have a number of limitations: they are sensitive to occlusion of the line of sight, thus blocking space next to the operating table; their latency is too high for demanding applications like robotics control or tremor compensation; and their accuracy drifts with operating time and aging. We present a new approach to tracking systems that addresses these problems. The system works with an arbitrary number of "intelligent" camera modules, reducing occlusion problems and allowing adaptation to tracking tasks of different complexity. The camera modules each feature integrated hardware for image processing, enabling high update rates and low latency times. A dynamic online recalibration of the system will be performed to assure constant precision over time.
1. Introduction

Navigation systems are in broad use in image-guided surgery (IGS) applications. The objective of the navigation systems is to support the surgeon, enhancing his or her capabilities, without negatively influencing the work flow during the operation. This is only partly fulfilled by today's systems due to the optical tracking devices that are used for navigation. The way these devices work leads to a number of unwanted restraints1. The tracking principle is based on optically measuring the positions of active (luminescent) or passive (IR-reflective) markers that are attached to the object being tracked. The markers are imaged by two area-scan or three line-scan camera sensors. With this image data and knowledge of the camera system configuration (the camera parameters) one can reconstruct the 3-dimensional marker coordinates. However, there is no redundancy: if the line of sight of just one camera is blocked, tracking is no longer possible. Depending on the type of surgical procedure, the tracking system has to be placed at a suitable position immediately beneath the operation area. This position, normally available to assistants of the surgeon, can thus no longer be used in the usual way, making modified operation procedures necessary. In addition, all participating personnel have to pay attention not to interrupt the line of sight between the cameras and the tracked objects. The low update rates and high latency times of current optical tracking systems make them less suitable for demanding tracking tasks like robot surveillance or tremor control that need fast continuous 3-dimensional position data. Moreover, tracking precision is subject to aging effects and also varies over the running time of the device, as recent studies indicate2. We present a new approach to tracking systems, MOSCOT (Modular Scalable Optical Tracking System), that addresses these problems.

2. Materials & Methods/Implementation
2.1. Overview

MOSCOT uses a distributed setup of an arbitrary number of "intelligent" camera modules to introduce redundancy into the data acquisition process. The individual camera modules can be placed at arbitrary positions, e.g. at fixed positions at the ceiling of the operating theater, attached to an operating lamp, or temporarily at other suitable positions. The operating lamps are the best position as they normally have a clear sight of the situs and the illumination conditions for the cameras are well defined. By using different numbers and types of cameras the system can be flexibly scaled to tracking scenarios of different complexity. The system supports different camera types, using color cameras for color marker detection or IR-sensitive cameras for IR-reflective markers. Color markers are more difficult to segment from the environment under changing environmental lighting conditions, but are less sensitive to reflections and IR interference and provide additional information, e.g. to distinguish between different instruments.
Figure 1. Schematic overview of the MOSCOT system.
All camera modules use fast image processing hardware, resulting in high update rates and low system latencies. They are connected to a central control and reconstruction module that collects and merges the individual camera data and controls the system. The system is designed to perform a dynamic online recalibration that will reduce accuracy problems caused by aging and temperature drift during warm-up. This is work in progress. We will first describe the current prototype and its setup and then give an overview of enhancements planned for the future. We present a first application: tracking a new handheld robotic manipulator.

2.2. Camera Modules
In this first prototype system each camera module is composed of a commercially available camera with attached proprietary image processing hardware. The hardware uses a Xilinx Spartan FPGAa to extract the marker positions from the camera image in real-time. The complete detection process encompasses an image segmentation based on color and luminance, an erosion filter to suppress interferences, a classification by size and shape and finally a center-of-gravity calculation for the detected objects. All these steps are performed continuously while the image data is read out from the camera, thus eliminating the main cause of latency in a normal framegrabber: the copying of the complete image to a buffer before processing. The marker data (position and a quality measure from the marker classification) is available just after the last pixels have been received from the camera. Only this data is then transmitted onwards to the central control module, reducing data bandwidth needs and preventing processing bottlenecks there. A Cypress FX2 microcontroller handles the transmission of the data to the reconstruction module via a USB serial interface.

aField Programmable Gate Array: a general-purpose integrated circuit that is programmed for a certain task, which it then executes in hardware like a specialized processor
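A software sketch of this detection chain (thresholding, pixel grouping, size classification, centre-of-gravity); the erosion filter is omitted and all parameter values are illustrative, since the actual pipeline runs on the FPGA while the image streams in:

```python
import numpy as np

def detect_markers(frame, threshold=200, min_area=4, max_area=400):
    """Threshold segmentation, connected-component grouping, size
    classification and intensity-weighted centre-of-gravity."""
    mask = frame >= threshold
    labels = np.zeros(frame.shape, dtype=int)
    markers, next_label = [], 1
    # naive 4-connected flood fill (the FPGA does this on the fly)
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue
        stack, pixels = [(y, x)], []
        labels[y, x] = next_label
        while stack:
            cy, cx = stack.pop()
            pixels.append((cy, cx))
            for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                if (0 <= ny < frame.shape[0] and 0 <= nx < frame.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    stack.append((ny, nx))
        next_label += 1
        if min_area <= len(pixels) <= max_area:   # classification by size
            pts = np.array(pixels, dtype=float)
            w = frame[tuple(np.array(pixels).T)].astype(float)
            cog = (pts * w[:, None]).sum(0) / w.sum()
            markers.append((cog[1], cog[0]))      # (x, y) sub-pixel position
    return markers
```

Only the resulting (x, y) pairs and quality measures, not the image itself, would be sent onwards, which is what keeps the bandwidth to the control module low.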
Figure 2. A tracking camera module.
Figure 3. Tracking setup with 3 camera modules.
In the present setup the FPGA processes data at a bandwidth of approx. 15 MB/s (720x576 pixels at 50 interlaced frames per second). The FPGA is, however, fast enough to cope with much higher data rates of up to 130 MB/s (1280x1024 pixel camera at 100 Hz full frame rate). For the 2nd-generation prototypes we are currently testing faster digital CMOS camera sensors with 800x600 pixel resolution and a 100 Hz update rate.
2.3. Control & Reconstruction Module

The central control and reconstruction module collects the data from the individual camera modules and reconstructs the overall scenery. The received data contains the markers' positions and a measure of detection quality, computed from marker area, size and shape. This quality measure is used in the reconstruction process to weight the redundant position data from different cameras. The reconstruction is performed by triangulation of the marker positions from the known camera view poses3. An optional Kalman filter smooths the reconstructed marker positions to reduce noise; however, this also enlarges the system's response time. The filtering is optional because unfiltered data is better suited for tracking a robotic device, where the robot's control software performs the overall filtering of all sensor input data. During development the control & reconstruction tasks are done on a conventional PC connecting to the camera modules via USB. This allows good online control of the tracking and fast setup changes. A graphical user interface to monitor the tracking data in 2D and 3D has been developed for this purpose (see figures 4 & 5). Later on, when integrating the final version of the system, the control & reconstruction tasks (without the user interface) will run on an embedded real-time system.
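The weighted triangulation step can be sketched as a least-squares intersection of camera rays; the ray-based formulation and the use of the quality measure as a scalar weight are assumptions about the actual implementation:

```python
import numpy as np

def triangulate(centers, directions, weights):
    """Weighted least-squares intersection of camera rays.
    Each camera contributes a ray (center c_i, direction d_i towards
    the marker) weighted by its detection-quality measure w_i;
    redundant views from more than two cameras simply add more terms."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d, w in zip(centers, directions, weights):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += w * P
        b += w * P @ c
    # point minimizing the weighted squared distances to all rays
    return np.linalg.solve(A, b)
```

Because every extra camera only adds a term to the normal equations, an occluded camera can simply drop out of the sum without changing the interface, which is exactly the redundancy argument made above.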
Figure 4. The application used to calibrate the cameras and set the marker configuration.

Figure 5. The graphical user interface for online monitoring of the tracking process.
2.4. Calibration
To do a 3D-reconstruction of the tracked objects the camera system has to be calibrated i.e. its configuration has to be known. This encompasses internal (focal length, lens distortion, etc.) and external parameters (pose: position and orientation) of the cameras. The MOSCOT system is calibrated by using standard camera calibration techniques imaging a calibration body with known properties. From these views the initial camera system’s calibration parameters are calculated. To assure constant precision during operation the tracking process has to be monitored for effects of warming and aging. For this purpose we will update the external calibration parameters of the system dynamically by
performing an online bundle adjustment optimization over all camera and marker data5.

3. Results of First Test Setup
After integration of the major hardware and software modules, first tests with the prototype system were performed. As the prototype's camera modules do not support IR-reflective markers yet, a pointer tool with passive color markers was used. The scalability of the system was demonstrated by system setups with 2, 3 and 4 cameras at different locations. Robustness against occlusion was shown in a setup with 3 cameras: the line of sight of one arbitrary camera was interrupted while the tracking continued with a steady data stream from the remaining 2 cameras. We determined the latency of the system by toggling an active marker on and off and measuring the time until the reconstructed 3D result was available. The overall time was 35 ms. Of this, 25 ms is camera-specific latency (image read-out time of 18.4 ms plus an exposure time of 6.7 ms). This leaves approx. 10 ms as latency intrinsic to the MOSCOT tracking system, composed of FPGA & FX2 processing time, data transfer via USB and the reconstruction time in the PC. For first accuracy measurements we tracked the pointer tool during rotation around its tip and measured the deviation of the tracked marker path from an ideal sphere. This showed RMS errors of 1-1.5 mm inside a tracked volume of approx. (0.5 m)³. To show the performance of the MOSCOT tracking system in a real-world scenario, we successfully used it to track positions of ITD7, a newly developed handheld robotic manipulator that compensates movements of the surgeon and aligns autonomously to a planned target position.
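The sphere-deviation accuracy test can be reproduced with a standard linear least-squares sphere fit; this is a generic reconstruction of the evaluation, not the exact code used:

```python
import numpy as np

def sphere_fit_rms(points):
    """Fit a sphere to tracked marker positions (pointer rotated about
    a fixed tip) and return the RMS radial deviation.
    Uses |p|^2 = 2 c.p + (r^2 - |c|^2), linear in (c, r^2 - |c|^2)."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2 * P, np.ones((len(P), 1))])
    y = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + (center ** 2).sum())
    d = np.linalg.norm(P - center, axis=1) - radius
    return float(np.sqrt((d ** 2).mean()))   # RMS deviation from sphere
```

Applied to the recorded marker path, the returned value corresponds to the 1-1.5 mm RMS figure quoted above.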
4. Discussion

The results gained from our first experiments with the MOSCOT tracking system show that the modular, scalable approach to the system architecture is well suited to achieve the goals of occlusion robustness and low latency while providing flexibility for diverse tracking scenarios. Although the use of multiple cameras eliminates the interception problem to a large extent, it cannot solve it completely. This is only possible if enough cameras are used that under all circumstances at least two cameras have sight of the tracked objects. Normally, however, coordinates from more than just two cameras should be available.
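The camera-count argument can be made quantitative under a simple and admittedly idealized assumption of independent occlusion per camera; the occlusion probability used below is illustrative only:

```python
from math import comb

def p_tracked(n, p_occluded):
    """Probability that at least two of n cameras have line of sight,
    assuming independent occlusion with probability p_occluded each."""
    q = 1.0 - p_occluded  # visibility probability per camera
    return sum(comb(n, k) * q**k * p_occluded**(n - k)
               for k in range(2, n + 1))

# e.g. with a 20% chance of any single camera being blocked:
for n in (2, 3, 4, 6):
    print(n, "cameras:", round(p_tracked(n, 0.2), 4))
```

Even this crude model shows the tracking availability rising quickly with camera count, which is the motivation for the scalable multi-camera design.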
Figure 6. Test stand with the MOSCOT tracking system and robotic manipulator ITD.
The image processing hardware is capable of achieving a low system-intrinsic latency of 10ms. This latency, measured from the end of image read-out from the camera to the availability of the reconstructed result, is low compared to the camera-intrinsic latency (read-out plus exposure time). The system architecture is therefore well adapted to high-speed tracking tasks. Exposure time may be reduced further by using more sensitive camera sensors or brighter illumination, and read-out time by employing faster cameras. The accuracy of the tracked color marker positions in this first setup is not yet sufficient for advanced tracking demands. The main reasons for this are the off-the-shelf interlaced video cameras currently used, the sensitivity of the passive color marker detection to environmental lighting changes, and the limited resources in the current FPGA for advanced image filtering. We are addressing these problems with the second iteration of the tracking system, which is currently being developed.

5. Future Work
Having shown the principal feasibility of our modular approach to tracking, the next step is to improve the tracking accuracy. We are replacing the off-the-shelf interlaced PAL video cameras with high-resolution full-frame CMOS cameras. New hardware with extended FPGA resources has been developed in our group, and we are now implementing advanced image segmentation algorithms. In the future we will also look into using shape recognition for more accurate marker detection. To increase the robustness of color marker detection, we are evaluating a dichromatic reflection model [8] that uses an HDI colorspace better suited to the segmentation of color markers under changing environmental lighting conditions. We are also adding IR illumination to the cameras so that standard IR-reflective passive markers can be used. This allows the combination of both techniques by using colored IR-reflecting markers and a mixture of color- and IR-sensitive cameras, uniting the advantages of both marker types. We will study advanced global optimization algorithms like bundle adjustment to perform an online recalibration of the complete system during tracking. We expect this to reduce the influence of the aging and temperature drift the current systems are experiencing. The same procedures may also be used to integrate additional camera modules into an already running tracking system. All these extensions will be tested as a tracking component with the handheld manipulator tool ITD. For the long-term future, however, we are envisioning a completely tracked operating room.
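The core of such a bundle adjustment is the reprojection-residual vector. The following sketch builds it for a toy two-camera scene (all intrinsics, poses and marker points are invented); a real implementation would hand this residual to a nonlinear least-squares solver over the camera and marker parameters:

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N,3) with intrinsics K and pose (R, t)."""
    x = (K @ (R @ X.T + t[:, None])).T
    return x[:, :2] / x[:, 2:3]

def reprojection_residuals(cams, X, obs):
    """Stack image-plane residuals over all cameras; bundle adjustment
    minimizes the squared norm of this vector over poses and points."""
    res = [project(K, R, t, X) - uv for (K, R, t), uv in zip(cams, obs)]
    return np.concatenate(res).ravel()

# toy scene: two hypothetical cameras observing four markers
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
X = np.array([[0.0, 0, 5], [0.3, 0, 5.5], [0, 0.3, 6], [0.2, 0.2, 5.2]])
cams = [(K, np.eye(3), np.zeros(3)),
        (K, np.eye(3), np.array([-0.4, 0.0, 0.0]))]
obs = [project(K, R, t, X) for K, R, t in cams]  # perfect observations
r = reprojection_residuals(cams, X, obs)         # zero residual here
```

With perfect synthetic observations the residual is zero; in online recalibration, drift of the external parameters shows up as a growing residual norm.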
Acknowledgments

The work on project MOSCOT is supported by the University of Heidelberg, Fakultät für Klinische Medizin Mannheim, under grant FF Jun 01/932630, and by the Deutsche Forschungsgemeinschaft under grant MA 1150/39-1.
References
1. Martelli S, Bignozzi S, Bontempi M, Zaffagnini S, Garcia L (2003), Comparison of an Optical and a Mechanical Navigation System. Proc. of 6th International Conference of MICCAI, pp. 303-310, Montreal, Canada
2. Stifter J, Anner A, Riede R (2003), Precision and validation of optical tracking devices. Proc. of 3rd Annual Meeting of CAOS, Marbella, Spain
3. Hartley R, Zisserman A (2000), Multiple View Geometry in Computer Vision. Cambridge University Press
4. Zhang Z (2000), A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 22, no. 11, pp. 1330-1334
5. Triggs B, McLauchlan P, Hartley R, Fitzgibbon A (2000), Bundle Adjustment - A Modern Synthesis. Proc. of International Workshop on Vision Algorithms: Theory and Practice, pp. 298-372, Corfu, Greece
6. Liu B, Yu M, Maier D, Männer R (2003), Accelerated Bundle Adjustment in Multiple-View Reconstruction. 7th Int. Conf. on Knowledge-Based Intelligent Information & Engineering Systems (KES2003), pp. 1012-1017, Oxford, UK
7. Pott PP, Schwarz MLR, Köpfle A, Wagner A, Badreddin E, Männer R, Scharf HP, Weiser P (2003), ITD - A Handheld Manipulator for Medical Applications: Concept and Design. Proc. of 3rd Annual Meeting of CAOS, Marbella, Spain
8. Kravtchenko V, Little JJ (1999), Efficient Color Object Segmentation Using the Dichromatic Reflection Model. Proc. of IEEE PACRIM, pp. 90-94
9. Xilinx Corporation (2003), Spartan-3 1.2V FPGA Datasheet, www.xilinx.com
DEVELOPMENT OF AUTOCLAVABLE REFLECTIVE OPTICAL MARKERS FOR NAVIGATION BASED SURGERY
DIRK SCHAUER, TIMO KRUEGER AND TIM LUETH
BCMM - Berlin Center for Mechatronical Medical Devices
Department of Maxillofacial Surgery - Clinical Navigation and Robotics
Prof. Dr. mult. h.c. Juergen Bier and Prof. Dr. Tim C. Lueth
Medical Faculty Charité - Humboldt University at Berlin
Fraunhofer-Institute for Production Systems and Design Technology
Augustenburger Platz 1, 13353 Berlin, Germany

The goal of the presented work was the development of autoclavable reflective markers for navigation based surgery. Passive optical markers consist of geometric bodies coated with a diffuse reflective material. They are partly sterilisable in saturated steam (autoclavation) but are sensitive to disturbances due to wound liquids and tissue particles. Glass sphere markers use the physical effects of refraction and reflection. They are easy to clean intraoperatively but allow no autoclavation. Three milestones were defined to achieve a quality improvement of the glass sphere technology. Different antireflective and reflective layer materials and coating technologies were discussed and validated. A suitable adhesive for the connection between tool and marker must allow a high elasticity to compensate thermally induced mechanical stresses and must be certified according to USP 23. Differences in the specific thermal coefficients of tool and glass result in mechanical stresses during heating. The adhesive joint must compensate those forces and offer a high precision in marker positioning. We successfully developed a design and coating technology which considers all these corner points. A validation of the new reflective markers showed that the markers are autoclavable for at least 50 sterilisation cycles while the optical accuracy is equal to well-established markers in use for surgical applications. The introduced markers provide a high efficiency and reduced costs for frequently used optical navigation systems.
1. State of the art in optical position measurement
Optical position measurement remains the gold standard among computer-assisted navigation techniques in medicine. The method is distinguished by high precision and robustness compared to other measurement technologies such as ultrasound-, magnetic-field- or mechanically-based solutions. The significant disadvantage of optical position measurement is that its signal is disturbed whenever the optical connection between sensor and localizer is interrupted. Optical position measurement permits instrument movement within a large work space in all spatial degrees of freedom. The optical sensors consist of three column-and-row or two planar sensors in a known relative position. Comparable to the human eyes, the optical sensors
register linear or planar signals coming from optical markers and calculate their spatial position using the known distance between the sensors. An optical localizer consists of at least three single markers whose arrangement is known. The accuracy of the measurement increases with the number of markers used for a single localizer. The optical markers define a local co-ordinate system in which the surgical instrument or the anatomical structure is described. Different optical marker technologies are available today. Active optical markers consist of light emitting diodes (LEDs), which require a permanent power supply via cables (figure 1a). Those markers can be sterilised in saturated steam (134 degrees Celsius, 3 bar for 3 minutes), but the cables remain an ergonomic obstruction as well as a hygienic risk. Passive optical markers reflect infrared light, which is emitted by LEDs mounted onto the optical sensor. Disposable markers consist of plastic material coated with a diffuse reflective layer and are well established and widespread (figure 1b). The sterile reflectors are mounted onto high-grade steel posts, which are screwed to the localizer. Wound liquids or tissue particles cannot be cleaned away intraoperatively and significantly disturb the signal transmission.
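The local co-ordinate system defined by three markers can be sketched as follows (the marker positions and axis conventions below are hypothetical; real systems differ in their frame conventions):

```python
import numpy as np

def localizer_frame(m1, m2, m3):
    """Build a local coordinate system from three non-collinear markers:
    origin at m1, x-axis toward m2, z-axis normal to the marker plane."""
    x = m2 - m1
    x /= np.linalg.norm(x)
    z = np.cross(x, m3 - m1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.column_stack([x, y, z])  # columns are the local axes in world space
    return R, m1

def to_local(R, origin, p):
    """Express a world-space point (e.g. an instrument tip) in the local frame."""
    return R.T @ (p - origin)

R, o = localizer_frame(np.array([0.0, 0, 0]),
                       np.array([10.0, 0, 0]),
                       np.array([0.0, 10, 0]))
tip_local = to_local(R, o, np.array([5.0, 5.0, 2.0]))
```

Once the frame is known, the instrument geometry (e.g. the tip offset) stays fixed in local coordinates no matter how the localizer moves in the camera's world frame.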
Figure 1. Optical markers for navigation based surgery: a) light emitting diodes and b) disposable reflective markers with diffuse coating.
Reflective optical markers for industrial applications have been presented by Rohwedder Visotech GmbH. Those markers consist of high-precision glass spheres coated with a reflective and an antireflective layer. A high refraction index (n = 1.92) allows a low refraction of the incoming and reflected infrared light. The glass spheres are glued to a localizer matrix. The geometry of the sphere adjustments guarantees a high position accuracy of the optical markers. The markers can be cleaned easily during their use. However, they are sensitive to mechanical strains and chemicals, which result in scratches and local blindness.
A sterilisation in saturated steam is not possible due to the recommended layers and coating technology.
Figure 2. Reflective glass sphere markers: a) optical localizer, b) reflective glass sphere marker with reflection and antireflection layer and c) physical principle of light transmission.
Autoclavable passive markers, which use the principle of diffuse light reflection, have been presented by Precision Instruments Inc. (figure 3a). Those markers are still sensitive to local blindness as a result of surface pollution. Video markers were initially used for biomechanical as well as crash test analyses. Improvements in CCD sensor technology nowadays allow a higher precision in marker detection. However, the accuracy of the newly introduced systems is not comparable to common optical measurement systems and has not been validated in medical applications yet.
Figure 3. New trends in optical marker technology: a) autoclavable reflective markers with diffuse surface coating and b) video markers for pattern recognition.
There are no autoclavable passive markers available that provide robust usage including intraoperative cleaning and easy removal of wound liquids or tissue particles. Such markers would allow a significant cost
reduction for navigation based medical applications and offer a high user comfort.
2. Material and method
The glass sphere markers represent a high-potential technology compared to the solutions presented so far. The goal of our work was the improvement of this technology with regard to sterilisation in saturated steam. Initial experiments showed that the recommended coating material (reflective layer: aluminium; anti-reflective layer: magnesium fluoride) is destroyed completely during autoclavation and removed from the sphere surface. Mechanical stresses resulting from the different specific thermal coefficients of the gluing partners, low adhesive elasticity as well as the design of the adhesive joint resulted in breakage of the spheres during autoclavation. The anti-reflective layer material is sensitive to mechanical stresses and can be destroyed easily. Based on this knowledge we defined three milestones for the development of autoclavable reflective markers based on the glass sphere technology: a) search for suitable reflective and antireflective materials as well as improvement of the coating technology, b) selection and validation of a high-performance adhesive, c) optimisation of the adhesive joint between glass spheres and localizer matrix with regard to a minimisation of mechanical stresses during heating. A stepwise solution of these problems can result in a powerful alternative to the common marker technologies in optical position measurement in CAS.
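The order of magnitude of the thermally induced stresses can be estimated with the standard upper bound for a fully constrained joint, sigma = E * delta_alpha * delta_T. All material values below are assumed for illustration and are not taken from the paper:

```python
# Illustrative estimate of thermally induced stress from a CTE mismatch.
# All material values are assumptions, not data from the described markers.
E_GLASS = 70e9          # Young's modulus of glass [Pa], assumed
ALPHA_GLASS = 7.0e-6    # thermal expansion coefficient of the glass [1/K], assumed
ALPHA_CARRIER = 9.0e-6  # coefficient of a titanium-like carrier [1/K], assumed

def thermal_stress(delta_t, e=E_GLASS, a1=ALPHA_GLASS, a2=ALPHA_CARRIER):
    """Upper-bound stress for a fully constrained joint:
    sigma = E * |alpha1 - alpha2| * delta_T."""
    return e * abs(a1 - a2) * delta_t

# heating from room temperature to autoclave temperature (134 C)
sigma = thermal_stress(134.0 - 20.0)
print(f"approx. {sigma / 1e6:.1f} MPa")
```

Even a small CTE mismatch produces stresses in the tens of MPa over an autoclave cycle, which motivates both the elastic adhesive film and the matched carrier material discussed below.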
2.1. Coating material

The coating materials aluminium for the reflective layer and magnesium fluoride for the anti-reflective layer cannot be used if sterilisation in saturated steam is essential. They are destroyed during autoclavation and washed away. Magnesium fluoride is sensitive to mechanical strains, hot water as well as chemicals such as aldehydic or alcoholic liquids. Alternative reflective materials like chrome or silver are not suitable due to their poor mechanical characteristics and cannot be used; the layer would be destroyed during the mounting procedure. Suitable materials for the reflective as well as the antireflective layer must be mechanically and chemically robust and provide optimal optical properties. Common medical mirrors and endoscope optics use a special precious metal in combination with an oxide layer as optical coatings. These materials have also been used for the described task, although the successful application also
depends on the adhesive used as well as the design of the adhesive joint. Medical mirrors are not glued to their carrier instruments but fixed mechanically to their holder. Accuracy demands are of lower priority compared to high-precision optical instruments for measurement applications.

2.2. Adhesive

Adhesives for medical instruments in blood contact must be certified according to ISO 10993 and USP 23 class 6 or equivalent. For medical applications only a limited selection of adhesives is available. According to the product descriptions an autoclavation is possible, but the number of cycles is often not known. Different expansions of the gluing partners during heating can result in mechanical stresses within the adhesive joint. If both partners have equal thermal characteristics those stresses can be reduced but not suppressed completely. Elastic adhesives like silicones show a higher absorption of water during autoclavation because of the lower packing density of the molecule chains. Solid adhesives like ultraviolet-light-curing single-component adhesives or two-component epoxy resins provide lower water absorption but also offer lower elasticities. A suitable adhesive for the previously described application must be certified to the legal demands, offer a high elasticity and low water absorption. It should be chemically harmless in the liquid state and offer perfect contact conditions with glass, plastics and metal. A selection of adhesives which fulfil the described requirements according to the product descriptions is listed in table 1.

Table 1. Adhesives for medical instruments.

Product name      Chemical base  Curing effect  Supplier
LOCTITE 5248      Silicone       UV-light       LOCTITE/Sweden
LOCTITE 5092      Silicone       UV-light       LOCTITE/Sweden
Vitralit 7989     Acrylic        UV-light       Panacol-Elosol GmbH
Vitralit 1702     1-K-Epoxy      UV-light       Panacol-Elosol GmbH
DYMAX 140-M       Acrylic        UV-light       Dymax Corp.
DYMAX 197-M       Acrylic        UV-light       Dymax Corp.
KATIOBOND 4653    1-K-Epoxy      UV-light       DELO GmbH
AUTOMIX           2-K-Epoxy      Chemical       DELO GmbH
EPO-TEK 353       2-K-Epoxy      Chemical       Polytec GmbH
EPO-TEK 302-3M    2-K-Epoxy      Chemical       Polytec GmbH
An experimental evaluation of the listed adhesives showed that the two-component epoxy resins fit the described requirements perfectly. A slight increase of elasticity at temperatures higher than 65 degrees Celsius offers mechanical relaxation during heating.
The two-component epoxy resin cures in a chemical reaction, so the design of the adhesive joint need not consider the transport of light or other curing triggers. The adhesive provides a high viscosity and is self-evacuating, which allows perfect gluing characteristics. Heat delivery during curing can accelerate the curing process but carries the risk of mechanical stresses.

2.3. Design of the adhesive joint

The design of the adhesive joint should allow a simple and safe positioning of the coated spheres as well as an easy alignment of their optical axis. A steady distribution of the adhesive between the gluing partners ensures a high firmness. Mechanical stresses resulting from temperature-dependent expansions can be reduced by an enlargement of the area in contact. Different designs of the adhesive joint, such as drill holes, conical and spherical hollows, have been validated in experimental studies. The design as a pan-shaped spherical hollow allows an accurate positioning of the optical marker, as the sphere can be glued in a "swimming" state: a constant adhesive film results from the hydrostatic pressure between the gluing partners and the buoyancy of the sphere. The constant adhesive film prevents the partners from direct contact and from mechanical stress peaks during the heating process. The selection of the ideal localizer or carrier material depends on the specific thermal coefficient of the glass as well as functional and technical boundary conditions. The specific thermal coefficient of the glass is nearly equal to that of titanium, a light, easy-to-machine and robust metal. However, the common localizers are made of plastics like PEEK (polyetheretherketone) or PEI (polyetherimide). That circumstance made the design of single optical markers essential, which can be mounted and exchanged easily. The carriers were designed as straight and angular posts, which can be pushed into drill holes within the localizer.
The optical axis of the marker can be parallel or oblique, while the centre point of the sphere stays stable. An alternative design consists of a screw post, which has the same screw diameter and sphere centre point as the commercially available disposable markers. Instrument sets can be modified without any additional costs by exchanging posts and markers. Figure 4 shows the prototypes of straight and oblique markers as well as the screw posts.
Figure 4. Glass sphere markers: a) straight design, b) oblique design and c) screw post.
A prototype of an autoclavable localizer, whose reflective optical markers can be sterilised in saturated steam, is shown in figure 5. The optical markers resist the common cleaning procedures for medical instruments including washing, disinfection as well as sterilisation by steam, gas and radiation.
Figure 5. Prototype of an autoclavable localizer with passive reflective optical markers.
2.4. Autoclavation
To examine the behaviour of the coating during autoclavation, the coated spheres were sterilised 10 times in saturated steam at a temperature of 134 degrees Celsius and a pressure of 3 bar for 3 minutes (validated autoclave). Before and after the autoclavation the markers' optical characteristics were tested by measuring their signal-to-noise ratio under defined conditions. The experiment showed that the sterilisation has absolutely no influence on the optical characteristics and the coating is not affected. After gluing the glass spheres to the titanium posts (6 straight and 6 oblique carriers, 6 screw posts), all markers were sterilised in saturated steam for 50
autoclavation cycles (conditions as above). The optical characteristics were validated before and after autoclavation by measurement of the signal-to-noise ratio. Again, no effects were recognised or measured; all markers survived without any damage to the adhesive or coating. A subsequent autoclavation of the mounted optical localizers showed that the expansion of the plastic carrier can involve the risk of loosening and displacement of the smooth posts during heating. This effect has to be excluded by additional fixation elements.

3. Conclusion
The described work shows the successful design and validation of a new type of autoclavable, reflective optical markers for navigation based medical applications. The introduced glass spheres offer significant advantages over the common reflective layer technology. They provide a low rate of wear and can be cleaned easily, also during the operation. The use of an anti-reflective oxide layer significantly improved the surface resistance to mechanical strains as well as to chemicals. A disadvantage of the described optical markers is their reduced visibility due to the limited coating angle of the reflective layer. This effect can be compensated by an optimal tool design and a custom-made modification of the optical axis of the instrument. The described glass spheres should not be used in medical applications with a risk of mechanical shocks applied to the navigated instrument: the glass can break and hurt patient or user. The presented passive markers will be available at higher prices compared to disposable markers, but their reuse reduces operation costs significantly after their 20th or 30th clinical use. A statistical validation of the new technology as a prerequisite for MDD certification will follow in the near future.

4. Acknowledgements
This research work has been performed at the Berlin Center for Mechatronical Medical Devices, a co-operation between the Department of Maxillofacial Surgery - Clinical Navigation and Robotics (Prof. Dr. Juergen Bier and Prof. Dr. Tim C. Lueth), Medical Faculty Charité, Humboldt University Berlin, and the Fraunhofer-Institute for Production Systems and Design Technology - IPK in Berlin, Department of Medical Assistant Devices (Prof. Dr.-Ing. Eckart Uhlmann). The work has been supported by the Alfried Krupp von Bohlen und Halbach Foundation. Parts of the research have been supported financially by the European Regional Development Fund (ERDF), the Deutsche Krebshilfe (granted to Prof. Dr. J. Bier and Prof. Dr. Wust) and the
Berliner Sparkassen Foundation for Medicine (granted to Prof. Dr. T. Lueth, Dr. E. Heissler and Prof. Dr. B. Hell). Special thanks are also due to the companies RoboDent, NDI and NDI-Europe, Altatec, Ziehm, Instrumentarium, Planmeca, Straumann, Medtronic, and Philips for their support of the project.
ISO-C 3D NAVIGATED DRILLING OF OSTEOCHONDRAL DEFECTS OF THE TALUS: A CADAVER STUDY
MUSA CITAK, JENS GEERLING, DANIEL KENDOFF, MARTINUS RICHTER, TOBIAS HUFNER, CHRISTIAN KRETTEK
Trauma Department, Hannover Medical School (MHH), Hannover, 30625, Germany
The aim of the operative treatment of osteochondral lesions of the talus of stage I and II according to Berndt and Harty is the revascularisation of the defect. Intraoperatively, not all locations on the talus can be reliably identified with arthroscopic support or radiological imaging alone, so incorrect drillings must be assumed. As an alternative to invasive treatments, computer-assisted navigated retrograde drillings have been in use for a short time, and an improvement in the precision of these operations could be shown. In 7 human cadaver feet an osteochondral lesion was created via medial malleolus osteotomy. Within this cadaver study the precision, radiation exposure and the operative set-up of Iso-C 3D based navigation were evaluated. Specific problems arise due to the intraosseous metallic fixation of the DRB, causing considerable artefacts in Iso-C 3D images. This problem was solved by the development of a new rotationally stable fixation pin. A disadvantage of Iso-C 3D navigation is the necessity of a major effort in material resources.
1. Introduction
The aim of the operative therapy of osteochondral defects of the talus is the revascularization of the defect areas [1]. Proper intraoperative visualization of the defect with arthroscopy or X-ray imaging is not guaranteed, depending on the localization of the defect area. Exact retrograde drilling of these lesions might be problematic and drilling failures might occur [2,3]. An alternative to open therapy is the use of computer-assisted navigated retrograde drilling. This method has been in use for a short period of time and has shown precision improvements of drilling procedures. The accuracy and the operation set-up of Iso-C 3D based computer-assisted drilling and resection of osteochondral lesions were determined in a cadaver study.
2. Material and Methods
In 7 human cadaver feet an osteochondral lesion was created via medial malleolus osteotomy. The dynamic reference base was positioned in the head of the talus with a newly developed rotationally stable single screw. The Iso-C 3D three-dimensional image data was acquired by a C-arm scan and sent to a Surgigate navigation system (Medivision). The defects were visualised in multidimensional layers. Defined trajectories were used to define the entry point and depth of the planned drilling. Under permanent navigation control the drilling was performed with a 2.5 mm drill, following the preplanned trajectories. To check the results an Iso-C 3D scan was performed at the end of the protocol. In addition, a conventional computed tomography scan of every cadaver was taken to confirm the results before anatomic control with opening of the malleolus was performed.
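A planned trajectory of this kind reduces to simple vector arithmetic: an entry point, a unit direction toward the lesion, and the required drilling depth. A minimal sketch with made-up coordinates (not taken from the study):

```python
import numpy as np

def plan_trajectory(entry, target):
    """Return the unit drill direction and required depth from a planned
    entry point to the target lesion (coordinates are illustrative)."""
    v = np.asarray(target, float) - np.asarray(entry, float)
    depth = float(np.linalg.norm(v))
    return v / depth, depth

def tip_position(entry, direction, advance):
    """Drill-tip position after advancing a given distance along the path."""
    return np.asarray(entry, float) + advance * direction

# hypothetical entry and lesion coordinates in mm
direction, depth = plan_trajectory([0.0, 0.0, 0.0], [12.0, 9.0, 20.0])
```

During navigated drilling, the system continuously compares the tracked tip position against `tip_position(entry, direction, advance)` for the planned advance, which is what allows the depth to be respected without direct sight of the lesion.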
Figure 1: An osteochondral lesion was created via medial malleolus osteotomy; preoperative Iso-C 3D scan.
3. Results
The results showed exact retrograde drilling of all lesions with the 3.2 mm drill. No drill failures occurred and the planned trajectories were confirmed. The accuracy was confirmed with immediate intraoperative Iso-C 3D and postoperative CT scans. Both modalities showed the same results and were congruent. Dissection via the medial malleolus osteotomy showed no anatomic perforation of the drill in the talus.
Figure 2: Intra-operative navigated drilling.
4. Discussion
The use of computer-assisted navigated retrograde drilling of osteochondral lesions has been described as a new technique with promising results. Currently used computed tomography (CT) [5] and fluoroscopy-based navigation are limited in their flexibility and in their intraoperative image data [3]. The advantage of three-dimensional navigation is the direct visual control of the drilling procedure in multiplanar reconstructions, which allows exact drilling even in anatomically difficult regions. So far, Iso-C 3D navigation still requires additional equipment with extra cost and training of personnel. Whether there will be a significant advantage compared to conventional methods under operative conditions has to be shown in further clinical studies.
References
[1] Berndt AL, Harty M (1959) Transchondral fractures (osteochondritis dissecans) of the talus. J Bone Joint Surg Am 41: 988-1018
[2] Morgan CD (1991) Gross and arthroscopic anatomy of the ankle. In: McGinty JB (ed) Operative arthroscopy. Raven Press, New York, pp. 677-694
[3] Kendoff D, Geerling J, Mahlke L, Citak M, Kfuri M, Hufner T, Krettek C (2003) Navigated Iso-C 3D-based drilling of an osteochondral lesion of the talus. Unfallchirurg 106: 963-967
[4] Conti SF, Taranow WF (1996) Transtalar retrograde drilling of medial osteochondral lesions of the talar dome. Operat Tech Orthop 6: 226-230
[5] Fink C, Rosenberger RE, Bale RJ, Rieger M, Hackl W, Benedetto KP, Kunzel KH, Hoser C (2001) Computer-assisted retrograde drilling of osteochondral lesions of the talus. Orthopade 30: 59-65
DEVELOPMENT OF A NAVIGATION SYSTEM FOR TRANSCRANIAL MAGNETIC STIMULATION (TMS)
AXEL WECHSLER, STEFAN WOESSNER, JAN STALLKAMP
Fraunhofer-Institute for Manufacturing Engineering and Automation, Nobelstr. 12, 70569 Stuttgart, Germany
AXEL THIELSCHER, THOMAS KAMMER
University of Ulm, Department of Psychiatry III, Leimgrubenweg 12-14, 89075 Ulm, Germany
TMS is currently used in neurology to explore the cortical representations of certain functions of the brain. In psychiatry, it is intended to be used for the treatment of psychological disorders such as depression. Currently, this method is evaluated in clinical studies around the world. In both cases, knowledge of the precise location and orientation of the induced currents is essential for the interpretation of measurement results. In close collaboration with the University Hospital Ulm, Germany, the Fraunhofer Institute for Manufacturing Engineering and Automation (IPA) has developed a navigation system especially adapted to TMS requirements.
1. Transcranial Magnetic Stimulation (TMS)
1.1. Principle

TMS is based on the non-invasive generation of current pulses on the cortex. Those current pulses are induced by magnetic field pulses that propagate through the skull; the magnetic field pulses can therefore be generated outside the skull. This is achieved by a coil of electric conductors and an electric pulse generator that applies high-current pulses to the coil.
Figure 1: Pulse generation on the cortex
The electric currents that are induced on the cortex surface stimulate the cortex in the region of their application.

1.2. Parameters
The stimulation results depend on the current distribution on the cortex. This current distribution is influenced by several parameters such as the pulse’s length, the current’s amplitude, location, and direction, and the shape of the stimulation coil. The stimulation results also depend on the pulse frequency and the total number of pulses applied.
1.3. Applications
Figure 2: Calculated current distribution on the cortex
Neurology

In neurology, TMS is used to map the cortical representation of certain neurological functions such as muscle contraction and visual perception. In so-called motor-mapping experiments, the stimulation coil is manually moved to pre-defined grid positions around the proband's head. When the desired location is reached, the stimulation pulses are applied and the proband's muscular reaction is recorded. By mapping the reactions to the grid locations, a map of the cortical representation of muscles is generated. A comparison of the
individual motor maps allows for identification of analogies and individual differences in cortical muscle representations. The visual cortex is mapped similarly, except that the proband's reaction cannot be measured but has to be described by the proband himself.

Psychiatry

The use of TMS as a method for the treatment of psychological disorders is currently evaluated in clinical studies. The therapeutic effects of TMS are not well known yet. Early studies suggest a simple rule for locating a region of the prefrontal cortex whose repeated stimulation by TMS is said to ease depression. Researchers at the University Clinic Ulm, Germany, Department of Psychiatry III, focused on the verification of this simple rule [1]. Additionally, they are looking for alternative ways of identifying this therapeutically relevant region. Obviously, there are more applications [2] of TMS, but these are the areas the navigation system was intended and designed for. A common question in TMS is the value of the threshold of electric pulse current beneath which reactions do not appear. The measurement of this threshold value, depending on the stimulated region, the stimulation coil shape and the stimulated individual, is very common in TMS practice.
2. Navigation Systems
2.1. Navigation Systems for Surgery
There are several systems available for surgical navigation. They are designed for pre-surgical planning and for the visualization of tools and the situs during the operation. Most navigation systems for surgery are based on optical trackers in order to achieve a high measurement precision. A navigated surgery procedure starts with the acquisition of patient data, mostly by MRT or CT. Based on these data, the surgeon uses the navigation system software to plan operation procedures and paths. After registration of the pre-recorded 3D data with the fixated, anesthetized patient, the pre-surgical planning data is visualized together with real-time tool location information during surgery to help the surgeon reproduce the pre-planned procedure. With some surgical navigation systems, re-planning during surgery is possible. This can be very helpful in case of unpredictable complications like extensive bleeding.
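The registration step in such a workflow is typically solved as a least-squares rigid fit between corresponding landmark sets, e.g. with the Kabsch/Horn SVD method. A minimal sketch (all landmark coordinates below are invented for illustration):

```python
import numpy as np

def register_landmarks(src, dst):
    """Least-squares rigid transform (R, t) mapping src landmarks onto dst
    via the Kabsch/Horn SVD method, as used for point-based registration."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    H = (src - sc).T @ (dst - dc)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# hypothetical landmarks: image-space points and their patient-space positions
src = np.array([[0.0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60]])
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1.0]])
dst = src @ R_true.T + np.array([5.0, -3.0, 12.0])
R, t = register_landmarks(src, dst)  # recovers R_true and the translation
```

With noiseless correspondences the transform is recovered exactly; with real landmark noise, the residual of this fit is the registration error the system has to report.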
2.2. Special requirements of TMS
Navigation systems to be used for TMS face some additional requirements:
1. Surgical navigation systems are designed for fixated patients. During TMS the patient is awake and able to move.
2. Surgical interventions are not performed repeatedly on one patient; therefore, registration results do not have to be reproducible. TMS, however, is commonly applied repeatedly to the same individual, at intervals of hours, days, or weeks. To achieve reproducible examination results, the registration results therefore have to be reproducible.
3. Since surgical navigation systems are used for pre-planning, interaction during surgery is mostly limited to visualization; almost no input is generated during surgery. In TMS, the examiner interacts with the software almost constantly, since viewing angles change continuously. To allow comfortable interaction, a navigation system for TMS must be controllable and viewable from a distance of a few meters.
4. Most surgical navigation systems cannot be used for documentation purposes. In TMS, the documentation of multiple sessions is essential.
3. Navigation System Design for TMS
To meet the additional requirements mentioned above, a new navigation system especially designed for TMS applications was developed. As with surgical navigation systems, an optical tracking system was chosen to optimize the tracking precision. Since the patient is awake and free to move his head during TMS, the position of the head has to be tracked continuously using a tracked rigid body fixed to the head. This is done with a rubber band in order to avoid patient discomfort. Reproducibility of the registration is achieved through a two-step registration routine. As with surgical navigation systems, the registration is based on anatomical landmarks. To enhance registration precision, the registration routine can additionally handle surface points. In the first registration step, a head coordinate system, defined by three anatomical landmarks, is calculated. All measured landmarks are then transformed into this coordinate system. Using the software BrainVoyager™, a constant transformation from the head coordinate system to the MRT data coordinate system is calculated. This step involves the surface points and is performed only once. In the second step, the positions of several anatomical landmarks are measured, and a transformation from the measurement coordinate system to the head coordinate system is calculated. Using both the non-permanent transformation between the measurement coordinate system and the head coordinate system and the pre-recorded permanent transformation from the head
coordinate system to the data coordinate system, measured coordinates can be transformed both to head coordinates, which are used for documentation purposes, and to data coordinates, which are needed for visualization. Using this technique, the recorded head coordinates for identical stimulation positions are the same in every session. To achieve remote operability, the user interface includes several features such as large buttons, zoom, a wireless pointing device and acoustic feedback. The documentation of TMS studies comprises the location and orientation of the coil in head coordinates, which can also be exported to data formats used by other software. It also includes the time of the stimulation and, if desired, a name and description of the location. Since the stimulation coil is of considerable weight, the tremor of the examiner's hand cannot be neglected. Therefore, a TTL pulse issued by the stimulation pulse generator is used to determine the exact time of stimulation and to store the coil location information at that moment.
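The two-step transformation chain described above can be sketched in a few lines. The specific landmark choice, the axis construction and the 4x4 homogeneous-matrix convention below are illustrative assumptions, not the authors' implementation, and the identity matrix stands in for the permanent head-to-data transform computed once with BrainVoyager.

```python
import numpy as np

def head_frame(lm1, lm2, lm3):
    """Build a rigid 4x4 transform from tracker coordinates into a head
    coordinate system defined by three anatomical landmarks (the landmark
    choice and axis convention here are illustrative)."""
    x = lm2 - lm1
    x /= np.linalg.norm(x)                # first axis along lm1 -> lm2
    z = np.cross(x, lm3 - lm1)
    z /= np.linalg.norm(z)                # normal of the landmark plane
    y = np.cross(z, x)                    # completes a right-handed frame
    R = np.vstack([x, y, z])              # rows: head axes in tracker coords
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = -R @ lm1                   # maps tracker points to head coords
    return T

# Session-dependent transform: measurement (tracker) -> head coordinates
tracker_to_head = head_frame(np.array([0.0, 0.0, 0.0]),
                             np.array([10.0, 0.0, 0.0]),
                             np.array([0.0, 10.0, 0.0]))
# Permanent transform head -> MRT data coordinates (identity as a stand-in)
head_to_data = np.eye(4)

coil_tip = np.array([5.0, 2.0, 1.0, 1.0])     # homogeneous tracker coords
head_coords = tracker_to_head @ coil_tip      # stored for documentation
data_coords = head_to_data @ head_coords      # used for visualization
```

Because the head coordinate system is defined by anatomical landmarks, the composed chain yields identical head coordinates for identical stimulation positions across sessions, as the text states.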
Figure 3: Screenshot of navigation software “BrainView 2”
4. Results
The development of the navigation system for TMS resulted in the software BrainView 2, which is currently being validated in daily use at the University Clinic Ulm, Germany, in the department Psychiatry III. It is used for clinical studies evaluating the potential use of TMS in depression therapy and in mapping experiments, including motor mapping and mapping of the visual cortex.
Some preliminary results obtained using this software have been published so far.
4.1. Neurology
Motor Mapping
Motor-mapping experiments have shown that the measurement precision in six degrees of freedom (6DOF) obtained using this system is needed to generate reproducible motor maps3. The current threshold is observed to be significantly lower than previously measured using mechanical 3DOF tracking. This can be explained by the improved spatial resolution and the additionally gained orientation information. This lower threshold leads to the conclusion that the orientation and precise location of the current on the cortex play a vital role in the stimulation's impact on cortical activity. This observation could be explained by assuming that the cortical representation of a certain muscle is rather small compared with the "size" of the current distribution on the cortex.
Mapping of the Visual Cortex
Mapping experiments of the visual cortex4 also showed the lowered threshold mentioned above, which is explained accordingly. In contrast to the motor maps, visual maps show a very wide individual variation.
Figure 4: Maps of the visual cortex of four individuals
4.2. Psychiatry
Clinical studies verifying a simple standard procedure1 for finding a cortex region potentially relevant for depression treatment using TMS showed that this guideline is too imprecise to be valid. Using the developed navigation system, the individual differences in the cortical location of this region were found to be greater than the region itself, invalidating the standard procedure.
5. Conclusion
The development of BrainView 2 was necessary to meet the special requirements inherent to TMS. The software is now in daily clinical use and has proved to be well adapted to the needs of the TMS laboratory. Using the navigation system, researchers could optimize the precision of cortical maps and show that earlier threshold levels had been measured too high as a result of imprecise tracking methods.
References
1. U. Herwig, F. Padberg, J. Unger, M. Spitzer and C. Schoenfeldt-Lecuona. Transcranial Magnetic Stimulation in Therapy Studies: Examination of the Reliability of "Standard" Coil Positioning by Neuronavigation. Biol. Psychiatry, 50:58-61, 2001.
2. Mark S. George. Stimulating the Brain. Scientific American, September 2003: 33-39, 2003.
3. A. Thielscher and T. Kammer. Linking Physics with Physiology in TMS: A Sphere Field Model to Determine the Cortical Stimulation Site in TMS. NeuroImage, 17:1117-1130, 2002.
4. Thomas Kammer and Sandra Beck. Phosphene thresholds evoked by transcranial magnetic stimulation are insensitive to short-lasting variations in ambient light. Experimental Brain Research, Vol. 145, No. 3: 407-410, 2003.
FLUOROSCOPY-BASED NAVIGATED DRILLING OF FOUR OSTEONECROTIC LESIONS IN ONE PATIENT
MUSA CITAK, JENS GEERLING, DANIEL KENDOFF, HENDRIK m B B E N, CHRISTIAN KRETTEK AND TOBIAS HUFNER
Trauma Dept., Hannover Medical School (MHH), Hannover, 30625, Germany
The problem addressed here is the precise drilling of bone areas that are difficult to access during surgery. Accurate three-dimensional data are generated preoperatively but are not available during surgery. Fluoroscopy-based navigation provides permanent drill control, so the accuracy may be higher. Another advantage is that fluoroscopy time can be decreased. A 27-year-old female patient with a Non-Hodgkin lymphoma was treated with cortisol. Subsequently, a bilateral femoral and humeral head necrosis was diagnosed. Because of her age, an operation with drilling of all four extremities was indicated. Several drillings of 3.5 and 4.0 mm were made with the aid of a fluoroscopy-based navigation system. Performing navigated drilling of all four extremities with osteonecrosis within one operation posed no problem. Because of the time-consuming draping and repositioning of the patient and the complex set-up of the navigation, the operation took 180 minutes. The total X-ray exposure time, including all adjustments and the intraoperative navigation of all four extremities, was 48 s. The postoperative images confirmed the correct position of all drillings.
1. Introduction
The precise drilling of osteonecrotic bone areas is challenging under normal fluoroscopic control. Accurate three-dimensional data are generated preoperatively but are not available during surgery; intraoperatively, only fluoroscope images are available. For each drilling the fluoroscope has to be moved, so the drilling can become imprecise. Because fluoroscopy-based navigation allows navigation in up to four images, the accuracy may be higher. Another advantage is that fluoroscopy time can be decreased. The advantages of navigation are shown in a clinical example of drilling osteonecroses near the articular surfaces of all four extremities.
2. Clinical case
A 27-year-old female patient with a Non-Hodgkin lymphoma was treated with cortisol. Subsequently, a bilateral femoral and humeral head necrosis was diagnosed. In MRI, FICAT stages 3 and 4 were diagnosed.
Figure 1: Necrotic areas are clearly visible in the preoperatively taken MRI
Because of her age, an operation with drilling of all four extremities was indicated. Several drillings of 3.5 and 4.0 mm were made for the four osteonecroses with the aid of a fluoroscopy-based navigation system (Medivision, Surgigate, Switzerland). A large drill diameter was used to minimize drill flexion. The dynamic reference base was positioned close to each defect. Two X-ray images were taken in two planes. Trajectories were planned and the drilling was performed percutaneously. Finally, two X-ray images were taken of the virtual end position of the drill.
3. Results
Performing navigated drilling of all four extremities with osteonecrosis within one operation posed no problem. Because of the time-consuming draping and repositioning of the patient and the complex set-up of the navigation, the operation took 180 minutes. The total X-ray exposure time, including all adjustments and the intraoperative navigation of all four extremities, was 48 s.
Figure 2: Intra-operative navigated drilling of the left femur
Only two or three images were needed for registration; the other images were used for set-up and adjustment. The postoperative images confirmed the correct position of all drillings.
4. Conclusion
Fluoroscopy-based navigated drilling of osteonecrotic lesions made it possible to raise accuracy and to minimize radiation time. The permanent visualization of the drill axis helps the surgeon to avoid unnecessary traumatization of the bone. By rescaling the image contrast it is possible to demarcate the area of interest better than in conventional fluoroscope images. In the future it will become possible to raise the accuracy even further by combining MRI and fluoroscopy.
Figure 3: Intra-operative navigated drilling of the left femur.
ISO-C 3D ACCURACY CONTROL AND USEFULNESS IN CALCANEUS OSTEOSYNTHESIS
DANIEL KENDOFF, JENS GEERLING, MARTINUS RICHTER, TOBIAS HUFNER, MUSA CITAK, CHRISTIAN KRETTEK
Trauma Dept., Hannover Medical School (MHH), Hannover, 30625, Germany
MAURICIO KFURI, JR.
Alexander v. Humboldt Stiftung; Ribeirao Preto Medical School, Sao Paulo, Brazil
Intraoperatively, bidimensional imaging methods are usually too limited to show details of complex calcaneus fractures. Computed tomography (CT) has been the standard method for decision-making and postoperative analysis. Introduction of the mobile Iso-C 3D could be a reasonable solution for achieving three-dimensional intraoperative control. Cadaveric feet were assembled with calcaneus osteotomies including the lateral articular surface. After plate fixation, fractures were deliberately anatomically reduced in two groups, and articular steps of 0.5, 1.5, 2 and 3 mm were simulated. Combined groups with correct screw placement and misplacement were included. Measurements with the Iso-C 3D, conventional C-arm imaging and CT were made and compared by different surgeons. Observers were able to detect all articular steps of 1 mm (one observer also 0.5 mm) and all screw misplacements. Comparison between Iso-C 3D data and CT showed the same results concerning intra-articular steps and screw misplacements. Detection of screw misplacement, which is often missed during conventional fluoroscopic examination, is possible. Intraoperative analysis of the multiplanar reconstructions provided by the Iso-C 3D offers immediate control of the reduction and the screw placement in calcaneus fractures, enabling operative consequences during the same operative procedure.
1. Introduction
The calcaneus is a tarsal bone often injured due to high-energy trauma. [1] Bidimensional imaging methods are usually too limited to show details of its complex shape. Therefore computed tomography has been the standard method for decision-making and postoperative analysis. [2] The problem remains in the operating theater, where the judgment should be made acutely and is usually assisted by bidimensional imaging techniques. Intraoperative tomography is conceivable but costly. The introduction of the mobile Iso-C 3D could be a reasonable solution for achieving three-dimensional operative control. [3] We designed an experimental study to evaluate the precision of the Iso-C 3D regarding the detection of articular steps and implant misplacements in the calcaneus.
2. Materials and Methods
Cadaveric feet were assembled with calcaneus osteotomies addressing the articular surface of the lateral joint fragment, according to the Sanders classification. [4] The fracture was then fixed with a Sanders plate (Synthes). We created four different simulations. In Group 1 the fracture was anatomically reduced with normal screw placement. In Group 2 we simulated articular steps of 0.4 mm, 1.46 mm, 1.98 mm, and 3.5 mm (Figure 1).
Figure 1: Iso-C 3D scan of a reproduced intra-articular step
The screws, however, were normally placed. We measured the steps with a caliper (CD-15CP, Mitutoyo Inc, Aurora, Illinois, USA) with an accuracy of 0.1 mm according to the producer. Anatomical articular reduction combined with screw misplacements constituted Group 3 (Figure 2). Finally, in Group 4, articular steps of 0.5 mm and 1.5 mm were simulated with associated misplacement of screws. All groups were submitted to traditional radiographic control including AP view, lateral view, and Broden 10°, 20° and 30° views. They were also submitted to tomographic control. The Iso-C control was done with the slow and quick protocols and an arc of movement of 180°. In order to check whether the foot decubitus could interfere with image quality, the Iso-C slow protocol was done with the foot in lateral decubitus, in 30° external rotation and in 60° external rotation. A point in the middle of the articular fragment was taken as a reference for comparing the real step and the virtual step shown on the monitor. The known length of the screws was compared with the virtual length shown on the monitor. Finally, conventional CT scans were done to verify the measurements.
Figure 2: Iso-C 3D scan of intra-articular screw placement
3. Results
Fifteen measured distances between titanium pins on foam bodies were compared with the virtual measures by the three observers; the differences were not significant (p=0.29; p=0.39; p=0.42). The differences in the lengths of the holes were also not significant (p=0.34; p=0.43; p=0.47). Observers were able to detect all articular steps of 1 mm (one observer also 0.5 mm) and all screw misplacements. No interference from the foot decubitus was recognized. Comparison between Iso-C 3D data and CT showed the same results concerning intra-articular steps and screw misplacements.
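The comparison of real and virtual distances above amounts to a paired test on the per-pin measurement differences. The study's statistics package and raw data are not given, so the sketch below uses a hand-rolled paired t-statistic and invented illustrative values; a small |t| indicates no systematic deviation between caliper and on-screen measurement.

```python
import math

def paired_t(real, virtual):
    """Paired t-statistic for the differences between physically measured
    and on-screen (virtual) distances."""
    d = [r - v for r, v in zip(real, virtual)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

# Illustrative values only -- not the study's data (distances in mm)
real    = [20.1, 35.4, 18.2, 42.0, 27.5]
virtual = [20.3, 35.1, 18.4, 41.8, 27.9]
t = paired_t(real, virtual)
```

For these made-up numbers the statistic is well inside the non-significance region, mirroring the reported p-values of 0.29 to 0.47.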
4. Discussion and Conclusion
Distances between defined points and the lengths of trajectories can be precisely measured with the Iso-C 3D in a foam geometric body. Articular steps as small as 1.0 mm are recognizable in calcaneus fractures. Detection of screw misplacement, which is often missed during conventional fluoroscopic examination, is possible. Intraoperative analysis of the provided multiplanar reconstructions offers immediate control of the reduction and the screw placement, enabling operative consequences in the same operative procedure. Revisions in calcaneus surgery might decrease, and postoperative CT scans can be reduced. Clinical studies need to evaluate the clinical consequences and benefits.
References
1. Zwipp H, Tscherne H, Thermann H, et al. Osteosynthesis of displaced intraarticular fractures of the calcaneus. Results in 123 cases. Clin Orthop 290: 76-86, 1993.
2. Eastwood DM, Phipp L. Intra-articular fractures of the calcaneum: why such controversy? Injury 28 (4): 247-259, 1997.
3. Kotsianos D, Rock C, Euler E, Wirth S, Linsenmaier U, Brandl R, Mutschler W, Pfeifer KJ. 3-D imaging with a mobile surgical image enhancement equipment (ISO-C-3D). Initial examples of fracture diagnosis of peripheral joints in comparison with spiral CT and conventional radiography. Unfallchirurg 104(9): 834-8, 2001.
4. Sanders R, Fortin P, DiPasquale T, et al. Operative treatment in 120 displaced intra-articular calcaneal fractures. Results using a prognostic computed tomography scan classification. Clin Orthop 290: 87-95, 1993.
CRANIOFACIAL ENDOSSEOUS IMPLANT POSITIONING WITH IMAGE-GUIDED SURGICAL NAVIGATION
J. HOFFMANN, D. TROITZSCH, C. WESTENDORFF, F. DAMMANN*, S. REINERT
Department of Oral and Maxillofacial Surgery, Tubingen University Hospital, Osianderstrasse 2-8, 72076 Tubingen, Germany; * Department of Radiology, Tubingen University Hospital, Hoppe-Seyler-Strasse, 72076 Tubingen, Germany
Craniofacial implants provide retention and excellent stability for auricular prosthodontic rehabilitation. The locations of implant placement for an optimal prosthetic outcome are critical to detect. We used image-guided navigation for implant positioning to test its feasibility and practical impact. Image-guided surgery was performed using a passive infrared surgical navigation system (VectorVision™, BrainLAB). The preoperative computed tomography (CT) data were obtained using the Somatom Sensation 16 multislice scanner (Siemens). After skull reference array attachment, the patient-to-image registration was performed using a surface laser scanning technique. A total of 8 implants was placed in the mastoid area and further miscellaneous craniofacial locations. The implant positioning was planned conventionally and updated after image-guided linear measurement in conjunction with the bone thickness. After registration, axial, coronal and sagittal reconstructions of the pointer tip position in the regions of interest were displayed in real time. The proper and controlled location and positioning of implants in the craniofacial area could be strongly improved. The navigation-assisted surgical technique can assist in the proper positioning of craniofacial implants, which in turn can improve the prosthetic result.
1. Background
Percutaneous endosseous craniofacial implants have a high functional and aesthetic impact in supporting prosthetic restoration with minimal morbidity. This applies to patients with specific craniofacial defects where plastic and reconstructive surgery is not possible.1 Retrievable facial prostheses are provided for the auricular, nasal, orbital and midfacial regions. The overall craniofacial implant survival rate is high, with greater satisfaction in patients treated with implant-retained facial prostheses compared with conventional adhesive-retained prostheses. Improvements in planning for implant placement have been achieved with CT scanning. CT information enables preoperative bone thickness determination, and the positioning of implants can be determined in relation to the proposed location of the prosthesis. Accurate three-dimensional (3D) orientation of surgical instrument trajectories is essential, as the correct placement of bone implants represents a determining factor in the therapeutic benefit. Any major change of direction and extension of the pre-planned implant position may contribute to a substantial loss of biomechanical stability, thus leading to impaired implant survival. Optimal implant placement with consideration of the bone tissue supply and preservation of vital structures such as the intracranial space, the dura, the mastoid cells and blood vessels implements a stable prosthetic condition. These principles apply to many surgical subspecialties, ranging from orthopaedic surgery concerning hip and knee replacements to head and neck surgery and craniofacial implant reconstructions. Thus, great effort is put into improving the accuracy of implant positioning. Navigation systems are mainly used for a better 3D orientation in anatomically complex sites. In order to improve treatment safety, mainly by providing more precise surgical approaches and reduced operating time, computer-aided navigation applies to an increasing extent to the control of treatment. The use of navigational guidance for medical implant insertion has become particularly suitable in patients with limitations of anatomic orientation, e.g. due to prior extensive ablative tumor surgery, where correct placement of implants is challenging.9, 10 Relying on preoperatively acquired computerized tomography (CT) and/or magnetic resonance imaging (MRI) data, the current position of surgical devices is displayed on a monitor in near real time relative to the patient, providing topographic orientation at any time. The reliability of optical navigation systems has been found to be sufficient in daily clinical routine. The degree of accuracy, depending on image acquisition, CT layer thickness, different methods of patient-to-image registration and the navigation procedure itself, has been assessed by measuring deviations between anatomic landmarks identified in CT images and corresponding positions on object surfaces. Yet there is little data concerning the precise amount of 3D angular deviation of pre-planned trajectories performed by navigational guidance.
2. Materials and Methods
Craniofacial implants were placed using the frameless optical VectorVision™ Compact (VVC, BrainLAB, Munich, Germany) navigation system. The VVC works with passive marker technology. Tracking of surgical tools is performed wirelessly using infrared light. The position of instruments is calculated via infrared light reflection from reflective marker spheres fixed to the patient and the instruments. A total of 8 implants were placed in the mastoid area and various other craniofacial locations, with a conventional clinical CT scan protocol and a regular registration error. The mean angular deviation of navigated drills with regard to the 5 degrees of freedom was assessed to provide a data basis of accuracy for navigated craniofacial implant surgery.
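The angular deviation between a planned and an actually drilled trajectory is, in the usual geometric sense, the angle between the two axis direction vectors. The paper does not describe the system's internal computation, so this is only a minimal sketch of that definition:

```python
import numpy as np

def angular_deviation_deg(planned, performed):
    """Angle in degrees between the planned and the performed drill axis,
    both given as 3D direction vectors."""
    a = planned / np.linalg.norm(planned)
    b = performed / np.linalg.norm(performed)
    cosang = np.clip(np.dot(a, b), -1.0, 1.0)   # guard against rounding
    return float(np.degrees(np.arccos(cosang)))
```

For example, a drill axis tilted by 45° in one plane relative to the planned axis yields `angular_deviation_deg(np.array([0.0, 0, 1]), np.array([0.0, 1, 1]))` = 45.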
Figure 1. Image-data-based surgical planning using a so-called "pointer" marking the region of interest in relation to CT 3D bone reconstruction data.
The intraoperative setup is illustrated in Figure 1. A skull reference array was attached to the patient's head in a determined position. 3D object-to-image calibration was performed by laser surface registration. A BrainLAB tool adapter consisting of three reflective marker spheres was fixed to the surgical drill. Drillings were performed using a conventional surgical trephine on the basis of the reference CT data.
3. Results
A total of 8 implants were placed in the auricular and different miscellaneous regions. The mean registration error, determined by the root mean square (RMS), i.e. the overall deviation for the registration set, was 0.86 mm (SD: 0.25 mm). Identifying and aiming at the drill starting point, determined by the place of penetration and calculated as the difference between pre-planned and effective position, was accomplished with high accuracy.
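The RMS value quoted above is, by the stated definition, the root mean square of the point-wise deviations over the registration set. A minimal sketch of that computation (the navigation system's internal implementation is not given in the paper):

```python
import numpy as np

def rms_error(registered, reference):
    """Root mean square of the point-wise distances (e.g. in mm) between
    registered fiducial/surface points and their reference positions."""
    d = np.linalg.norm(registered - reference, axis=1)   # per-point error
    return float(np.sqrt(np.mean(d ** 2)))
```

If every point of the registration set deviates by exactly 1 mm, the RMS is 1 mm; outliers are weighted more heavily than in a plain mean because the distances are squared before averaging.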
Figure 2. LEFT: Target points are marked on the bone surface. RIGHT: Craniofacial implants positioned exactly by navigation guidance.
All procedures using image-guided surgical navigation were successful. Implants were placed without causing collateral damage. On the skin surface, the pre-planned position was localized precisely as a target point in alignment with multiplanar and 3D CT data. The target point was then marked (Figure 2).
4. Discussion
A number of factors contribute to the accuracy of navigational guidance. Image acquisition, with parameters like CT layer thickness, voxel size and image data distortion, and the different methods of patient-to-image registration have a greater impact on navigational precision than the technical accuracy and the navigation procedure itself. In this context, imaging modalities were found to be of smaller influence on localization error than the number and attachment modality of registration fiducial markers for object-to-image registration.2 A registration protocol based on external fiducial marker technology in turn results in a smaller navigation error than matching with anatomical landmarks. True accuracy can furthermore be improved by surrounding the operative target with a widespread field of fiducials. Concerning registration accuracy, stereotactic frame registration outmatches skin marker or bone fiducial registration.
Figure 3. Navigation monitor screen shot displaying all three dimensions with regions of interest marked by color. Precise information about bone thickness and drill trajectories is provided.
The accuracy of navigation systems has been assessed mostly in experimental studies, as clinical evaluation is difficult for various reasons. In earlier studies on polyurethane milling models concerning the accuracy of navigated drillings for implant placement in the maxilla, the mean localization error was found to be smaller than 1 mm. The precision of image-guided implant positioning has been investigated in cadaver studies providing important information on the degree of accuracy concerning anatomical structures like blood vessels or nerve tissue. The angulation of the trajectories of drilled holes was assessed mostly from computed tomography scan data, which in turn aggravates measurement inaccuracies.6 The accuracy of image-guided dental implant surgery has been studied in anatomically complex operation sites.3 In certain cases, an insertion of fixtures longer than those planned prior to the operation was possible. To overcome inaccuracies mainly emerging from object-to-image registration errors, new automatic and markerless laser scanning-based techniques using skin surface registration have become suitable for clinical practice. They furthermore may reduce the radiation load, as there is no need for a radiological fiducial marker scan. In conclusion, the registration procedure type, as a main source of accuracy loss, should be chosen based on individual requirements. Finally, only very few prospective randomized clinical trials comparing accuracy and treatment quality of navigationally guided versus conventional interventions are available. Further investigation concerning this topic needs to be done. The clinical benefits emerging from technological advances in surgical navigation are well known. Yet the complexity of the new technical environment needs to be evaluated carefully, taking different types of errors into particular consideration.
References
1. P. J. Schoen, G. M. Raghoebar, R. P. van Oort, H. Reintsema, B. F. van der Laan, F. R. Burlage, J. L. Roodenburg and A. Vissink, Cancer. 92, 3045 (2001).
2. P. Messmer, B. Baumann, N. Suhm and A. L. Jacob, Rofo Fortschr Geb Rontgenstr Neuen Bildgeb Verfahr. 173, 777 (2001).
3. A. Wagner, F. Wanschitz, W. Birkfellner, K. Zauza, C. Klug, K. Schicho, F. Kainberger, C. Czerny, H. Bergmann and R. Ewers, Clin Oral Implants Res. 14, 340 (2003).
4. R. Metson, Otolaryngol Head Neck Surg. 128, 8 (2003).
5. J. Hoffmann, F. Dammann, D. Troitzsch, S. Muller, M. Zerfowski, D. Bartz and S. Reinert, Biomed Tech (Berl). 47 Suppl 1 Pt 2, 728 (2002).
6. M. Caversaccio, L. P. Nolte and R. Hausler, Acta Otorhinolaryngol Belg. 56, 51 (2002).
7. J. E. Hausamen, J Craniomaxillofac Surg. 29, 2 (2001).
8. A. Raabe, R. Krishnan and V. Seifert, Surg Technol Int. 11, 314 (2003).
9. F. Wanschitz, W. Birkfellner, F. Watzinger, C. Schopper, S. Patruta, F. Kainberger, M. Figl, J. Kettenbach, H. Bergmann and R. Ewers, Clin Oral Implants Res. 13, 59 (2002).
10. U. Meyer, H. P. Wiesmann, C. Runte, T. Fillies, N. Meier, T. Lueth and U. Joos, Br J Oral Maxillofac Surg. 41, 102 (2003).
11. J. Berry, B. W. O'Malley, Jr., S. Humphries and H. Staecker, Ann Otol Rhinol Laryngol. 112, 689 (2003).
12. J. Claes, E. Koekelkoren, F. L. Wuyts, G. M. Claes, L. Van den Hauwe and P. H. Van de Heyning, Arch Otolaryngol Head Neck Surg. 126, 1462 (2000).
13. A. Gaggl and G. Schultes, Int J Oral Maxillofac Implants. 17, 263 (2002).
14. S. Nishihara, N. Sugano, M. Ikai, T. Sasama, Y. Tamura, S. Tamura, H. Yoshikawa and T. Ochi, J Knee Surg. 16, 98 (2003).
15. R. Steinmeier, J. Rachinger, M. Kaus, O. Ganslandt, W. Huk and R. Fahlbusch, Stereotact Funct Neurosurg. 75, 188 (2000).
16. N. Sugano, T. Sasama, Y. Sato, Y. Nakajima, T. Nishii, K. Yonenobu, S. Tamura and T. Ochi, Comput Aided Surg. 6, 195 (2001).
17. F. Watzinger, W. Birkfellner, F. Wanschitz, F. Ziya, A. Wagner, J. Kremser, F. Kainberger, K. Huber, H. Bergmann and R. Ewers, Plast Reconstr Surg. 107, 659 (2001).
18. R. Marmulla, S. Hassfeld, T. Luth and J. Muhling, J Craniomaxillofac Surg. 31, 267 (2003).
A HYBRID APPROACH TO MINIMALLY INVASIVE CRANIOMAXILLOFACIAL SURGERY: VIDEOENDOSCOPIC-ASSISTED INTERVENTIONS WITH IMAGE-GUIDED NAVIGATION
J. HOFFMANN, D. TROITZSCH, C. WESTENDORFF, F. DAMMANN*, S. REINERT
Department of Oral and Maxillofacial Surgery, Tubingen University Hospital, Osianderstrasse 2-8, 72076 Tubingen, Germany; * Department of Radiology, Tubingen University Hospital, Hoppe-Seyler-Strasse, 72076 Tubingen, Germany
The application of minimally invasive approaches to complex craniomaxillofacial procedures requires an advanced technology comprising image-data-based surgical navigation. In addition, increased experience with endoscopically assisted and navigationally guided techniques, as well as the development of dedicated instruments, has opened access for minimally invasive procedures to the craniofacial region. One such option is the hybrid approach to craniomaxillofacial and plastic surgery, which combines minimally invasive endoscopic techniques with image-guided surgical navigation technology. To assess the performance of surgical navigation, thirty-eight patients covering a broad range of treatment indications were scheduled for image-guided surgery using a wireless passive infrared surgical navigation system (VectorVision™, BrainLAB). The preoperative computed tomography (CT) data were obtained prior to surgery. Patient-to-image registration was performed by laser surface scanning. The registration accuracy was expressed by a calculated value, the root mean square (RMS). The system has been used for endoscopically assisted surgery, for the insertion of dental and craniofacial implants, for the reduction of complex midfacial trauma, for skull bone tumor removals and for further miscellaneous interventions. The overall mean accuracy (RMS) was 1.24 mm (SD: 0.58). The preliminary outcome of patients receiving the hybrid approach technology has been remarkably positive. Image guidance offers helpful assistance, especially in cases with complex anatomy. Despite some limitations, many craniomaxillofacial problems can be solved safely and effectively.
1. Background
Technological advances in video imaging, endoscopy and the development of specific surgical instruments have made it possible to convert many procedures in different surgical specialities from open to endoscopic or endoscopically assisted procedures. For many interventions, the "invasiveness" involved has been dramatically reduced, resulting in a superior outcome presenting clinically as reduced morbidity, fewer complications and a quicker return to functional health and social activity.1 The main advantages arising from surgical navigation are a better three-dimensional orientation, a more confident surgeon, and a more precise and less invasive surgical approach. Recent studies have shown that the use of navigation systems provides information about anatomical structures and
updated orientation of surgical instruments on the basis of preoperative data obtained by computerized tomography (CT) or magnetic resonance imaging (MRI) [2-7]. The craniofacial skeleton is among the most complex regions of the human body. Traditional access techniques involve some type of scalp, facial or extensive intraoral incision. However, increased experience with endoscopically assisted and image-data-based techniques, as well as the development of dedicated instrumentation, has made it possible to perform minimally invasive procedures even in craniofacial regions [7-10]. Furthermore, when dealing with complex procedures, navigated techniques can complement the exposure offered by traditional approaches, avoiding or minimizing the need for facial incisions or osteotomies. The purpose of this study was to evaluate the performance of state-of-the-art surgical navigation techniques for use in craniomaxillofacial surgery.

2. Materials and Methods
Our experience comprises patients presenting a variety of diagnoses who were surgically treated with the use of surgical navigation, in some cases in combination with endoscopically assisted techniques. Thirty-eight patients with different indications were scheduled for image-guided surgery using a passive infrared surgical navigation system. "Endoscopically assisted" surgery was defined as surgery in which the endoscope was used to complement a traditional approach, to visualize the region of interest, or to verify removal of material such as osteosynthesis hardware or foreign bodies. For intraoperative surgical navigation, a wireless, passive reflecting system (VectorVision™, BrainLAB) was used; the application of this system in neurosurgical settings has been described in detail [11]. It consists of two infrared cameras, a computer workstation with a high-resolution monitor, universal instrument adapters and a pointer. The VectorVision™ system works with so-called "passive marker technology": infrared light-emitting diodes are positioned around the cameras, allowing armless and wireless tracking of pointers or surgical instruments fitted with reflective markers on the universal tool adapters. The software contains various tools for virtual pointer elongation, surgical path planning and linear measurement. Preoperative computed tomography (CT) data were obtained before surgery using a newest-generation Somatom Sensation 16 scanner (Siemens). The data sets were transferred via network in DICOM format to the system's workstation.
After attaching the skull reference to the patient's head (Figure 1), patient-to-image registration was performed using surface scanning with a class I laser device ("z-touch™").
Figure 1. Intraoperative situs illustrating the setup for endoscopically assisted and navigation-guided surgery. A dynamic reference array is fixed on the skull. The endoscope is fitted with a reference adapter and integrated in the navigation environment using an instrument calibration matrix (blue box with three marker spheres).
The advanced surface-matching algorithm utilizes the individual patient anatomy as a reference system by acquiring up to a hundred surface points, without actually touching the patient and without the need for any fiducial markers [12-14]. After registration, axial, coronal and sagittal reconstructions of the pointer tip position were displayed in real time on the navigation screen. The registration accuracy was expressed by a calculated value, the root mean square (RMS), i.e. the overall deviation for the registration set [15]. For system validation, the intraoperative accuracy was visually checked by identification of anatomical landmarks.
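The RMS value reported here is the root mean square of the residual point distances after the best rigid fit between the scanned surface points and the image data. As a rough illustration only (not the vendor's implementation; the point sets, helper names and the closed-form Kabsch/Horn fit are our own assumptions), the computation can be sketched as:

```python
import numpy as np

def register_rigid(src, dst):
    """Least-squares rigid registration (Kabsch/Horn, corresponding points).

    src, dst: (N, 3) arrays of paired points.
    Returns rotation R and translation t such that dst ~ src @ R.T + t.
    """
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)              # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def registration_rms(src, dst, R, t):
    """Overall deviation of the registration set: root mean square of the
    residual distances after applying the estimated transform."""
    residuals = dst - (src @ R.T + t)
    return float(np.sqrt((np.linalg.norm(residuals, axis=1) ** 2).mean()))
```

With noise-free corresponding points the RMS is essentially zero; with real laser-scan data it reflects scan noise and registration error, which is the figure the system displays.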
3. Results
The system has been used for endoscopically assisted procedures, the insertion of dental implants, complex trauma cases, the resection of bone lesions and further miscellaneous interventions. Intraoperative navigation and endoscopically assisted procedures were successful in all patients. All surgical areas healed uneventfully, with barely visible incisions. We did not encounter any navigation- or endoscopy-related intra- or postoperative complications. Important features were preoperative 3D visualization and image-guided, target-fixed surgery with the ability to indicate at any given time the topographic relationship between the instrument and the patient's anatomy.
Figure 2. Hybrid approach: image-data-based, navigationally guided transnasal endoscopy for removal of a foreign body.
Laser surface-scanning registration is one of the newest techniques for registering the patient quickly without using fiducial markers. We found it to be a reliable and easy-to-use registration method in clinical settings, taking only a few seconds, which makes navigation usable for every procedure for which preoperative image scans were taken. The overall mean accuracy (RMS) was 1.24 (SD: 0.58) when referencing was based on laser registration. The overall time expenditure to use the system was 20 to 30 minutes. Handling of the equipment was judged to be efficient and simple in most cases.
Figure 3. Image-data-based, navigationally guided, endoscopically assisted implant removal in the mandible.
4. Discussion
The technological innovations in surgery are just beginning. Building on the precedent of pelviscopy in gynecology and arthroscopy in orthopaedic surgery, the introduction of minimally invasive approaches into other surgical specialties, including general surgery, thoracic surgery, and plastic and reconstructive surgery, has changed not only the performance of specific operations but, more importantly, the strategic approach to many surgical subspecialties [1]. Endoscopically assisted techniques in craniomaxillofacial surgery can be applied by modifying techniques used in aesthetic and reconstructive surgery [10, 16, 17]. Most of the experience with endoscopic surgery has been gained in aesthetic surgery, where a premium is placed on the size and location of surgical incisions and, not least, on ease of recovery [17]. Endoscopic and less invasive approaches involve special challenges. First, owing to the loss of some degrees of freedom, task performance in a confined space is limited, and the range of motion of the instruments is restricted. Second, three-dimensional anatomy and orientation are displayed on a two-dimensional screen. Some of these challenges are being addressed by surgical navigation systems. Wider use of specialized image-guided techniques, including 3D modeling and reconstruction of imaging data from computerized tomography, magnetic resonance imaging and ultrasound, may support both preoperative and intraoperative data acquisition. Modern head and neck disciplines have been markedly improved by the introduction of new preoperative imaging techniques and intraoperative visualization tools [18]. Refined preoperative high-resolution imaging, e.g. computed tomography (CT) and magnetic resonance imaging (MRI), allows improved evaluation of the regions of interest. Anatomical variations, the extent of fractures or pathologies and topographical relations to adjacent structures can be determined and considered in surgical planning. Three-dimensional surface or volume reconstructions allow a more detailed analysis of craniofacial anatomy and tend to bridge the gap between radiology and surgery. Advanced radiographic techniques and computer-aided technology have become important tools in craniomaxillofacial surgery.
Using three-dimensional (3D) reconstruction, measurements of anatomical positions, areas, distances and volumes can be performed easily, giving the surgeon a feel for the geometry of the individual patient and a basis for objectively assessing clinical conditions. Further studies should test to what extent various parameters, such as computed tomography (CT) scan slice thickness, reconstruction pitch and data sampling volume, may affect the accuracy of intraoperative registration. Intraoperatively, improved accuracy and safety are achieved using visual support from surgical endoscopes and navigation systems. However, there is still a gap in combining these different types of clinically useful information in the operating room. The prerequisite for successful minimally invasive surgery is the ability to indicate at any given time the topographical relationship between the surgical instruments and the patient's anatomy, using a three-dimensional coordinate measuring system based on preoperatively generated image data. Although endoscope- and navigation-assisted technology is expanding in the fields of neurosurgery, skull base surgery, endonasal sinus surgery, and spine and orthopaedic surgery, there have been few reports of comprehensive clinical use of this technology in craniomaxillofacial surgery. Advances in computer graphics and system improvements may allow more detailed analysis and visualization of anatomical structures for surgical planning. Improvements in these applications, especially for craniomaxillofacial surgery, are resulting from advanced software modules, faster computer workstations and recent research results in close conjunction with clinical surgical experience. Advantages of endoscopic and navigated minimally invasive access include avoidance of extensive facial and oral incisions, rapid healing and a shortened hospital stay. These techniques can also complement traditional approaches to avoid the use of a second or third "open" incision. Intraoperative navigation technologies represent a significant extension of orientation, particularly where complex anatomical alterations are present or a minimally invasive procedure is planned. However, further modifications to improve both practical usability and integration into complex working platforms are essential. Despite a few limitations, we believe that, with an expanding role and rapid change in minimally invasive approaches, many common problems seen in the craniomaxillofacial surgical patient can be treated safely and effectively. Coupling surgical navigation with various minimally invasive techniques, such as endoscopic or endoscopically assisted surgery, will open new avenues of treatment for craniomaxillofacial questions.
References
1. M. J. Mack, JAMA 285, 568 (2001).
2. P. Grunert, K. Darabi, J. Espinosa and R. Filippi, Neurosurg Rev 26, 73 (2003).
3. J. K. Han, P. H. Hwang and T. L. Smith, Curr Opin Otolaryngol Head Neck Surg 11, 33 (2003).
4. J. E. Hausamen, J Craniomaxillofac Surg 29, 2 (2001).
5. A. Raabe, R. Krishnan and V. Seifert, Surg Technol Int 11, 314 (2003).
6. N. C. Gellrich, A. Schramm, B. Hammer, S. Rojas, D. Cufi, W. Lagreze and R. Schmelzeisen, Plast Reconstr Surg 110, 1417 (2002).
7. J. Hoffmann, F. Dammann, D. Troitzsch, S. Muller, M. Zerfowski, D. Bartz and S. Reinert, Biomed Tech (Berl) 47 Suppl 1 Pt 2, 728 (2002).
8. Y. Ducic, J Otolaryngol 30, 149 (2001).
9. M. Krimmel, C. P. Cornelius and S. Reinert, Int J Oral Maxillofac Surg 31, 485 (2002).
10. C. Lee, M. H. Mankani, R. M. Kellman and C. R. Forrest, Facial Plast Surg Clin North Am 9, 475 (2001).
11. H. K. Gumprecht, D. C. Widenka and C. B. Lumenta, Neurosurgery 44, 97 (1999).
12. R. Marmulla, S. Hassfeld, T. Luth and J. Muhling, J Craniomaxillofac Surg 31, 267 (2003).
13. D. Troitzsch, J. Hoffmann, F. Dammann, D. Bartz and S. Reinert, Zentralbl Chir 128, 551 (2003).
14. N. Sugano, T. Sasama, Y. Sato, Y. Nakajima, T. Nishii, K. Yonenobu, S. Tamura and T. Ochi, Comput Aided Surg 6, 195 (2001).
15. S. Nishihara, N. Sugano, M. Ikai, T. Sasama, Y. Tamura, S. Tamura, H. Yoshikawa and T. Ochi, J Knee Surg 16, 98 (2003).
16. F. F. Eaves, 3rd, J. Bostwick, 3rd and F. Nahai, Clin Plast Surg 22, 591 (1995).
17. O. M. Ramirez, Aesthetic Plast Surg 18, 363 (1994).
18. M. Caversaccio, L. P. Nolte and R. Hausler, Acta Otorhinolaryngol Belg 56, 51 (2002).
GEOMETRICAL CONTROL APPROACHES FOR MINIMALLY INVASIVE SURGERY

MICAEL MICHELIN, ETIENNE DOMBRE, PHILIPPE POIGNET

LIRMM - UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, 34392 Montpellier Cedex 5, France
This paper deals with control algorithms that allow a robot to achieve motions under the constraint of moving through a fixed point. This work takes place in the context of minimally invasive surgery, where the tool is telemanipulated by the surgeon through a penetration point: the trocar fixed on the patient. The algorithms are based on a geometrical description of the constraint, which is solved by an optimization procedure. They are experimentally validated on a 7 degree-of-freedom robot.
1. Introduction
In the past twenty years, minimally invasive surgery (MIS) has become widespread in surgical operations. MIS consists in operating through small incisions in the body. The surgeon uses dedicated instruments consisting of a long tube (30 - 40 cm) equipped with a tool at one end and a handle at the other. MIS adds difficulties to the surgical procedure. Indeed, the trocar reduces the tool orientation capabilities and inverts the motion direction, and friction in the trocar deteriorates the sensations. In the nineties, a few master-slave surgical robotic systems that manipulate the surgical instruments were developed. Their kinematic structure is designed to respect the entry point (trocar): for any robot motion, the instrument is constrained to pass through the trocar. The Zeus system makes use of a passive universal joint [1]. The Da Vinci system and other prototypes, such as the FZK Artemis and UCB/UCSF RTW systems, are designed as remote-center devices [2-5]. All these devices are dedicated to MIS. Finally, let us mention another way to create a fixed point: the implementation of an appropriate force-position control, in which the tool is position-controlled within the patient while respecting a force constraint on the trocar [6]. In this paper, we propose a different approach based on algorithmic resolution of the constraint of passing the instrument through the trocar. Two algorithms are presented. In the first one, the spatial locations of both ends of the instrument are computed such that the instrument (considered as a line) coincides geometrically with the trocar (considered as a point) during the execution of any continuous trajectory. Experiments have been run on an
anthropomorphic arm, the PA-10 from Mitsubishi, a seven degree-of-freedom (dof) arm with revolute joints (Figure 3): the forearm of the robot is considered as an instrument equipped with a spherical wrist at the distal part. We have shown that the constraint of passing the forearm through a fixed penetration point can be geometrically solved while the tool tip attached to the wrist follows a complex path (no orientation). In the second algorithm, the constraint is described in terms of virtual mechanical joints, as point-to-point and line-to-line contacts along the desired path. It has also been validated on the PA-10. This approach offers the advantage of being independent of the kinematics of the robot. These two approaches are summarized in Figure 1. Each one determines the adequate joint positions from the position of the trocar and the desired position of the tool tip.
Figure 1: Principle of the two control algorithms (diagram blocks: P_desired; geometrical respect of the constraint).
2. First approach
2.1. Geometric description of the constraint
As mentioned above, let us regard the forearm of the PA-10 as an endoscopic instrument with a distal wrist on which a tool is attached (Figure 2.a). The proximal end of the instrument is mounted on the "elbow" of a 3-dof robot (the first three dof of the PA-10). The constraint may be stated as follows: for any tool tip location along a path, compute the locations of the elbow and the wrist that keep the penetration point on the forearm. The constraint is satisfied when four conditions are realized. The first is that the elbow and the wrist should be on a line passing through the penetration point. The second is that the elbow should be located on a sphere centered on the shoulder, whose radius is equal to the arm length. The third is that the wrist should be on a sphere centered on the tool tip location, whose radius is equal to the hand length. The last is that the penetration point belongs to the elbow-wrist segment. These conditions can be expressed by the following set of equations:
F1(X) = X1 + X7 (X4 - X1) - xp = 0
F2(X) = X2 + X7 (X5 - X2) - yp = 0
F3(X) = X3 + X7 (X6 - X3) - zp = 0
F4(X) = (X4 - X1)^2 + (X5 - X2)^2 + (X6 - X3)^2 - L_forearm^2 = 0
F5(X) = (X4 - xt)^2 + (X5 - yt)^2 + (X6 - zt)^2 - L_hand^2 = 0
F6(X) = (X1 - xs)^2 + (X2 - ys)^2 + (X3 - zs)^2 - L_arm^2 = 0
F7(X): 0 <= X7 <= 1
(1)
where X is a vector whose components are the elbow coordinates in the task space (X1, X2, X3), the wrist coordinates (X4, X5, X6), and a scalar X7 that gives the relative position of the penetration point along the forearm. The coordinates of the penetration point are denoted by xp, yp and zp, while those of the tool tip and the shoulder are denoted by xt, yt, zt and xs, ys, zs respectively. xs, ys, zs are design parameters of the PA-10 robot. xp, yp and zp can be taught by the surgeon at the beginning of the surgical procedure. Finally, xt, yt, zt denote the coordinates of the current point along a given path.
Figure 2. a) Geometric constraint; b) Elbow location.
Equations F1, F2 and F3 express that the penetration point lies on the forearm. Equation F4 states that the distance between the elbow and the wrist is constant. Equations F5 and F6 state that the wrist and the elbow remain on their respective spheres. Equation F7 expresses that the penetration point remains between the elbow and the wrist. We solve the system of equations (1) by an optimization procedure that implements the Levenberg-Marquardt algorithm [7]. The vector X contains the locations of the elbow and the wrist in the task space that respect the penetration point constraint. To control the robot accordingly, we need the corresponding joint positions.
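A minimal sketch of this optimization step follows. The link lengths and test geometry are hypothetical (the paper gives no numeric values), and SciPy's bounded trust-region least-squares solver stands in for a pure Levenberg-Marquardt routine so that condition F7 can be expressed as a bound on X7:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical link lengths (m) standing in for the PA-10 geometry.
L_ARM, L_FOREARM, L_HAND = 0.45, 0.48, 0.07

def residuals(X, p, t, s):
    """Residuals F1..F6 of the constraint system; X = (elbow, wrist, lambda).

    F1-F3: the penetration point p lies on the elbow-wrist line,
    F4:    the elbow-wrist distance equals the forearm length,
    F5:    the wrist lies on a sphere centred on the tool tip t,
    F6:    the elbow lies on a sphere centred on the shoulder s.
    """
    e, w, lam = X[:3], X[3:6], X[6]
    return np.concatenate([
        e + lam * (w - e) - p,
        [np.linalg.norm(w - e) - L_FOREARM,
         np.linalg.norm(w - t) - L_HAND,
         np.linalg.norm(e - s) - L_ARM],
    ])

def solve_constraint(p, t, s, X0):
    # Condition F7 (penetration point between elbow and wrist) is
    # enforced as the bound 0 <= lambda <= 1.
    lb = [-np.inf] * 6 + [0.0]
    ub = [np.inf] * 6 + [1.0]
    return least_squares(residuals, X0, args=(p, t, s), bounds=(lb, ub))
```

The solution vector then yields the elbow and wrist locations in the task space, from which the joint positions are computed via the inverse geometric model.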
2.2. Inverse geometric model

The inverse geometric model is computed in three steps: in the first, we calculate the first two joint positions (q1, q2), which move the elbow to the location resulting from the optimization procedure; in the second, we calculate the third and fourth joint positions (q3, q4), which move the wrist to the location resulting from the optimization procedure, while q1 and q2 are fixed; in the last, we calculate the fifth and sixth joint positions (q5, q6), which move the tool tip to the desired location, while the other joints are fixed.

The elbow location depends on q1 and q2 only. Its z component expressed in frame R0 (Figure 2.b) is given by:

z_e = r1 + r3 cos(q2)    (2)

(where the subscript e stands for elbow), and then:

q2 = ± arccos((z_e - r1) / r3)    (3)

The x and y components of the elbow location expressed in frame R0 are given by:

x_e = r3 sin(q2) cos(q1),  y_e = r3 sin(q2) sin(q1)    (4)

yielding:

q1 = arctan(y_e / x_e),  q1' = q1 ± π    (5)

Let us assume that q1 and q2 are in a fixed position given by equations (5) and (3) respectively. We then proceed to the calculation of q3 and q4 as was done for q1 and q2, the wrist location now being expressed in frame R2. Finally, we proceed to the calculation of q5 and q6, the tool tip location being expressed in frame R4. Full computations are available in [7].
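These closed-form steps can be checked numerically. In the sketch below the offsets r1 and r3 are hypothetical stand-ins for the PA-10's actual link parameters, and atan2 is used to resolve the quadrant ambiguity of the arctangent:

```python
import numpy as np

R1, R3 = 0.317, 0.45  # hypothetical PA-10 link offsets (m)

def q2_from_elbow_z(z_e, elbow_up=True):
    """q2 = +/- arccos((z_e - r1) / r3): two mirror solutions exist;
    elbow_up selects the sign."""
    c = np.clip((z_e - R1) / R3, -1.0, 1.0)
    q2 = np.arccos(c)
    return q2 if elbow_up else -q2

def q1_from_elbow_xy(x_e, y_e):
    """q1 = arctan(y_e / x_e); atan2 picks the correct quadrant,
    the alternative solution being q1 +/- pi."""
    return np.arctan2(y_e, x_e)
```

Feeding a forward-computed elbow position back through these functions recovers the original joint angles, which is a convenient sanity check before running the full three-step model on the robot.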
2.3. Experimental results

The implementation has been made on the PA-10 under the QNX real-time operating system. The penetration point is materialized with four strings around the forearm (Figure 3). A complex path may be achieved by a series of elementary paths such as a straight line, a circle or a helix. Combining such primitives, sewing has been achieved with a 2 cm radius curved needle, which corresponds to a scale factor of ten compared with needles used in cardiac surgery. Since the PA-10 is a 7-dof robot, the constraint reduces the tool dof number to five. This obviously limits the complexity of the achievable motion (the plane of the needle cannot be controlled along a path). Nevertheless, it shows the feasibility and the interest of the approach.
Figure 3. The PA-10 achieving a straight line while passing the forearm through a fixed point.
3. Second approach
3.1. Description of the constraint

In this approach, an endoscopic instrument is mounted on the wrist of the PA-10. For the sake of simplicity, we do not consider a distal wrist at the end of the instrument, since the algorithm would be the same; if there were one, the orientation of the tool could be controlled independently. We describe the constraint in terms of virtual mechanical joints [8] defined between the instrument, the trocar, and the path regarded as a succession of points P^d (Figure 4). The algorithm states that the position of the tool tip must coincide with the desired current position P^d, and that simultaneously the instrument must be aligned with the segment joining P^d and the trocar position P^t.

3.2. Solution
If x^d, y^d and z^d are the components of the desired tool tip position and x, y and z are the components of the current one, the condition on the position of the tool tip is expressed by:

F1(Q) = x^d - x(Q) = 0
F2(Q) = y^d - y(Q) = 0
F3(Q) = z^d - z(Q) = 0

where Q is the vector of the current joint positions of the robot.
To guarantee that the instrument is aligned with the trocar, the vector product A ⊗ B must be equal to zero, where A is the segment that connects P^t to P, and B is the segment that connects P to P^w (the wrist position). This vector product is given by:

A ⊗ B = (A_y B_z - A_z B_y,  A_z B_x - A_x B_z,  A_x B_y - A_y B_x)

The components of A and B are:

A = (x - x^t, y - y^t, z - z^t),  B = (x^w - x, y^w - y, z^w - z)
Finally, we obtain a six-equation system that depends on the joint positions Q. The system is solved by a Levenberg-Marquardt algorithm, which returns the joint positions directly.
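A compact sketch of this six-equation system follows. The forward-kinematics function here is a deliberately simplified stand-in that parameterizes the wrist position and the instrument direction directly rather than the PA-10's joints, and the instrument length and test points are hypothetical:

```python
import numpy as np
from scipy.optimize import least_squares

L_INSTR = 0.35  # hypothetical instrument length (m)

def toy_fk(Q):
    """Simplified stand-in for the robot's forward kinematics:
    Q = (wrist x, y, z, direction angles theta, phi);
    returns (tool tip, wrist) positions."""
    wrist = np.asarray(Q[:3])
    th, ph = Q[3], Q[4]
    d = np.array([np.sin(th) * np.cos(ph),
                  np.sin(th) * np.sin(ph),
                  np.cos(th)])
    return wrist + L_INSTR * d, wrist

def alignment_residuals(Q, fk, p_desired, p_trocar):
    """F1-F3: the tool tip P matches the desired point P^d.
    F4-F6: A x B = 0, with A joining the trocar P^t to the tip and
    B joining the tip to the wrist, i.e. trocar, tip and wrist stay
    collinear so the shaft passes through the penetration point."""
    tip, wrist = fk(Q)
    A = tip - p_trocar
    B = wrist - tip
    return np.concatenate([tip - p_desired, np.cross(A, B)])
```

Solving with `least_squares(alignment_residuals, Q0, args=(toy_fk, p_d, p_t), method="lm")` drives both the tracking error and the misalignment with the trocar to zero simultaneously, which mirrors the role of the Levenberg-Marquardt step in the paper.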
Figure 4. The robot manipulates an endoscopic instrument (diagram labels: shoulder, arm, elbow, forearm, endoscopic instrument, penetration point (trocar)).
3.3. Experimental results

A Phantom 1.5 arm is used as a master device to generate the desired tool tip position of the PA-10 (Figure 5). The sampling frequency is 100 Hz for the PA-10. The experiment is composed of two phases: 1) the tool tip of the robot is driven through the Phantom until contact with the trocar is reached, and the instrument is then inserted (for this experiment, this step lasts twelve seconds, as shown in Figure 6.b); 2) at this time we switch to the abovementioned control algorithm and, again through the Phantom, perform random motions while satisfying the penetration point constraint (from twelve to thirty seconds). It can be verified that the distance between the trocar and the instrument shaft is smaller than four millimeters. The tool speed is up to 0.3 m.s-1 (Figure 7.a), the corresponding tracking error is less than 8 mm (Figure 6.a), and the tool acceleration reaches 15 m.s-2 (Figure 7.b).
Figure 5. Teleoperation under the constraint of penetration point
Figure 6. a) Tracking error (m); b) Distance trocar-instrument (m).

Figure 7. a) Tool speed (m.s-1); b) Tool acceleration (m.s-2).
4. Conclusion
We have presented two control algorithms that allow a robot to achieve motions passing an instrument through a fixed point. They are based on a geometrical task description and the resolution of a constraint satisfaction problem. The first is dedicated to the PA-10 robot used in our laboratory. The second is generic and can be used with any robot with five or more degrees of freedom. The performance of this strategy in terms of accuracy and velocity is satisfactory. Further work will concern the introduction of haptic feedback.
References
1. http://www.computermotion.com.
2. G. Guthart and J. K. Salisbury, "The Intuitive Telesurgery System: Overview and Applications", Proc. IEEE Int. Conf. on Robotics and Automation, San Francisco, 2000, pp. 618-621.
3. A. J. Madhani, G. Niemeyer and J. K. Salisbury, "The Black Falcon: A Teleoperated Surgical Instrument for Minimally Invasive Surgery", Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Victoria B.C., Canada, October 1998, pp. 936-944.
4. H. Rininsland, "ARTEMIS: a telemanipulator for cardiac surgery", European Journal of Cardio-Thoracic Surgery, 16: S106-S111 Suppl. 2, 1999.
5. M. C. Cavusoglu, W. Williams, F. Tendick and S. S. Sastry, "Robotics for Telesurgery: Second Generation Berkeley/UCSF Laparoscopic Telesurgical Workstation and Looking towards the Future Applications", Proc. 39th Allerton Conf. on Communication, Control and Computing, Monticello, IL, October 3-5, 2001.
6. A. Krupa, C. Doignon, J. Gangloff, M. de Mathelin, L. Soler and G. Morel, "Towards Semi-autonomy in Laparoscopic Surgery Through Vision and Force Feedback Control", Proc. Int. Symp. on Experimental Robotics, ISER '00, Waikiki, December 2000, pp. 189-198.
7. M. Michelin, E. Dombre, P. Poignet, F. Pierrot and L. Eckert, "Path Planning under a Penetration Point Constraint for Minimally Invasive Surgery", Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Lausanne, October 2002.
8. W. Khalil and E. Dombre, "Modeling, Identification and Control of Robots", Hermes Penton Science, London, ISBN 1-90399-613-9, 2002.
PROSPECTIVE HEAD MOTION COMPENSATION BY UPDATING THE GRADIENTS OF THE MRT*
C. DOLD, EVELYN A. FIRLE, G. SAKAS
Fraunhofer Institute for Computer Graphics, IGD, Fraunhoferstr. 5, D-64283 Darmstadt, Germany
E-mail: Christian.[email protected]

M. ZAITSEV, O. SPECK, J. HENNIG
University Hospital Freiburg, Hugstetterstr. 55, D-79106 Freiburg, Germany

B. SCHWALD
Computer Graphics Center, ZGDV, Fraunhoferstr. 5, D-64283 Darmstadt, Germany
Subject motion appears to be a limiting factor in numerous magnetic resonance imaging applications. For head imaging the subject’s ability to maintain the same head position for a considerable period of time places restrictions on the total acquisition time. For healthy individuals this time typically does not exceed 10 minutes and may be considerably reduced in case of pathology. In particular, head tremor, which often accompanies stroke, may render certain high-resolution 2D and 3D techniques inapplicable. Several navigator techniques have been proposed to circumvent the subject motion problem. The most suitable for head imaging appear to be the orbital or spherical navigator methods. Navigators, however, not only lengthen the measurement because of the time required for acquisition of the position information, but also require additional excitation RF pulses to be incorporated into the sequence timing, which disturbs the steady state. Here we demonstrate the possibility of interfacing the MR scanner with an external optical motion tracking system, capable of determining the object’s position with submillimeter accuracy and an update rate of 25Hz. The information on the object position is used without time penalty to update the position of the imaging volume during the acquisition of k-space data. Results of rotation phantom and in vivo experiments and the implementation in one MRI sequence are shown.
*Work partially supported by grant IST-2000-28168 of the Information Society Technologies programme.
1. Introduction
Patient motion remains a significant problem in many magnetic resonance imaging (MRI) applications, including fMRI, cardiac and abdominal imaging, as well as conventional long-TR acquisitions. Functional MRI (fMRI) is a non-invasive imaging technique that is used to investigate cerebral function. Many techniques are available to reduce or compensate for bulk motion effects, such as physiological gating, phase-encode reordering, fiducial markers, fast acquisitions, image volume registration, or alternative data acquisition strategies such as projection reconstruction, spiral and PROPELLER [2]. Navigator echoes are used to measure motion with one or more degrees of freedom. The motion is then compensated for either retrospectively or prospectively. An orbital navigator (ONAV) echo captures data in a circle in some plane of k-space, centered at the origin [4, 5]. These data can be used to detect rotational and translational motion in this plane, and to correct for this motion. However, multiple orthogonal ONAVs are required for general 3D motion determination, and the accuracy of a given ONAV is adversely affected by motion out of its plane. Methods capable of correcting for head motion in all six degrees of freedom have been proposed for human Positron Emission Tomography (PET) brain imaging [6]. These methods rely on the accurate measurement of head motion in relation to the reconstruction coordinate frame. Implementing a similar technique in MRI presents additional challenges. Foremost, the tracking system and the MRI system have to be compatible. High magnetic fields (≥ 1.5 Tesla) in MRI systems require that the tracking camera system be positioned at a sufficient distance from the MRI system to ensure proper function and safety. Functional MRI also proves challenging because of the high spatial accuracy (RMS < 0.3 mm) required of the complete measurement chain, together with a small latency time of the tracking system.
Determining a precise relationship between the spatially varying magnetic field gradients and the spatial tracking information is necessary to compensate for motion artifacts. First trials using an external tracking system to compensate for movement artifacts are published in [7].
2. Material and Methods

The technique was implemented on a Siemens Magnetom Trio 3T whole-body system (Siemens Medical Systems GmbH). The EOS tracking system developed by ZGDV has been adapted to the requirements of the MRI-MARCB project. The EOS optical motion tracking system is based
on a stereoscopic reconstruction of rigid bodies from gray-scale images. The tracking system reports the position and orientation of rigid targets fitted with "passive" retro-reflective markers in six degrees of freedom (6DOF), using two progressive-scan cameras synchronized by a frame grabber card in a standard PC. The cameras were equipped with infrared lenses to block the visible light, and the scene was illuminated with infrared light beamers attached to the cameras. Several targets can be tracked simultaneously at a sampling rate of up to 25 Hz, with a quoted positional accuracy of less than 0.3 mm (RMS) in this setup. Communication with the MR image reconstruction computer takes place over a TCP/IP connection. Passive targets consist of at least three coplanar retro-reflective markers. All markers are filled with doped water so that they are detectable by both MR and the tracking system. The point of origin of the tracking system is transformed to the physical center of the gradient system so that both coordinate systems coincide. The tracking system returns, in this common coordinate frame, the target orientation as a unit quaternion and the target position as a three-dimensional vector. The coordinate system of the Magnetic Resonance Tomograph (MRT) is controlled using magnetic field gradients and frequencies in the sequence. In vivo imaging experiments were performed on healthy volunteers. All experiments with human subjects were performed in accordance with local IRB regulations; informed consent was obtained prior to measurements.
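The quaternion-plus-vector pose reported by the tracker must be expressed in the scanner's gradient coordinate frame before the imaging volume can be updated. A minimal sketch (the tracker-to-scanner calibration transform R_cal, t_cal is an assumed input, e.g. obtained from a cross-calibration phantom):

```python
import numpy as np

def quat_to_matrix(q):
    """Unit quaternion (w, x, y, z) -> 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def target_pose_in_scanner(q, p, R_cal, t_cal):
    """Express a tracked target pose (quaternion q, position p, both in
    the tracker frame) in the scanner's gradient coordinate frame, given
    the tracker-to-scanner calibration (R_cal, t_cal)."""
    R = quat_to_matrix(q)
    return R_cal @ R, R_cal @ p + t_cal
```

The resulting rotation and translation are what a prospective-correction sequence would use to re-orient and re-position the slice stack before the next excitation.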
Figure 1. The setup of the EOS tracking system in front of the MR scanner.
2.1. Prospective Rotation Phantom Tests
A defined rotation is realized by a rotation phantom in 10 one-degree steps (fig. 2). Every translation and rotation value (6DOF) is stored in a log file by EOS. MR images are acquired with and without motion correction. The motion correction is performed prospectively by updating the slice positions and orientations based on the tracking information.
Figure 2. Experimental setup to measure rotations: the phantom is attached to a plate, which allows the setup to be rotated in a controlled way in increments of 1 degree.
2.2. Echo Planar Imaging (EPI) Sequence
The EPI sequence was modified to enable real-time slice-by-slice feedback from the image reconstruction computer, which communicated with the optical motion tracking system and generated the feedback information for the measurement. Imaging parameters were: FOV = 256 mm, 64² image matrix, 32 slices, 4 mm slice thickness, 1.2 mm slice gap, interleaved multi-slice acquisition, TE = 16 ms, TR = 1660 ms. In order to track the position of the subject's head, a mouthpiece with 3 retro-reflective markers was used. Subjects were instructed to bite the mouthpiece tightly to make sure it remained in contact with the upper jaw, so that the system of 3 markers and the skull formed a rigid body. No head fixation pads were used, in order to allow exaggerated motion during the imaging experiments.
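From the stated parameters, the slice stack and an interleaved acquisition order can be reconstructed. This is a hypothetical sketch: the paper does not give the exact ordering, and odd-then-even interleaving is only a common convention for reducing cross-talk between adjacent slices:

```python
def slice_geometry(n_slices=32, thickness=4.0, gap=1.2):
    """Slice-centre offsets (mm) about the isocentre and an interleaved
    (odd slices first, then even) acquisition order."""
    step = thickness + gap           # centre-to-centre spacing: 5.2 mm
    span = (n_slices - 1) * step     # distance between outermost centres
    centres = [i * step - span / 2 for i in range(n_slices)]
    order = list(range(0, n_slices, 2)) + list(range(1, n_slices, 2))
    return centres, order
```

With the parameters above, the 32-slice stack covers 32 × 4 mm of tissue plus 31 × 1.2 mm gaps, i.e. about 165 mm in the slice direction.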
3. Results

The positional accuracy was better than 0.1 mm (RMS). The mean value of the measured reproducibility of rotation and translation was 0.008 degree / 0.067 mm. No artifacts originating from interactions between the MR system and the tracking system were detectable in the MR images. The latency
Figure 3. A volunteer with a mouthpiece inside the head-coil but outside of the scanner.
of the whole measurement chain including the reconstruction PC was less than 25 ms.

Figure 4. The translation and rotation values determined by EOS without phantom motion are shown at top left and top right. The bottom plot represents the EOS data for the 10 rotation steps. The time to acquire these data was twelve minutes.

The prospective slice-by-slice motion correction avoids unrecoverable volume distortions (fig. 6, left). The remaining modulation may be attributed to EPI geometric distortions, which are known to be position-dependent. In experiments with less exaggerated head movements (±3 mm,
Figure 5. The original image (left) and the motion compensated image after rotation, obtained by updating the MRT with the data from the tracking system (middle). On the right no motion correction is activated in the MRT; the 10-degree motion is clearly visible.
±3 degrees) these modulations are not visible. The data demonstrate the advantages of optical motion tracking for motion correction in echo-planar time series: it enables not only volume-to-volume correction without additional computational overhead, but also slice-by-slice correction without increasing the scanning time.

4. Discussion
The results show the feasibility of using an external optical motion tracking system to correct for subject motion during the acquisition of a single image on a line-by-line basis. Even though a minor reduction in image quality due to noise in the position data delivered by EOS was observed, the results are good enough to motivate further efforts in prospective motion correction. The effect of random inaccuracies in the position data on the statistical properties of the measured images is still to be evaluated. The remaining artefacts appear to be due to the limitations of the current implementation of the positional feedback, in which the motion information is received by the reconstruction computer and then transferred to the measurement control unit, resulting in a lag of approximately 1 TR (independent of TR). The effects are much reduced in phantom experiments, where it is possible to induce "discrete" motion (fig. 5). Expected benefits of this technique include a significant reduction in imaging time, because it avoids the need to repeat acquisitions corrupted by motion. The technique may also increase the quality and statistical power of functional MRI, which is beneficial for surgical planning and neuroscience research. In fMRI studies this 3D prospective motion correction might allow all data from non-cooperative subjects to be used, even during periods of fast head motion within the acquisition of one volume. Furthermore, this technique will help to increase clinical efficiency, patient throughput and
Figure 6. Representative coronal images produced by reslicing transversal EPI data from the interleaved multi-slice acquisition. Severe displacements between neighboring slices are apparent when motion occurs during the acquisition of a single volume (left). Adjusting the position of each slice circumvents the intra-volume distortion problem (right). The remaining modulation and/or step-like structures at the surface of the brain are due to geometric distortions of EPI, which can change with the orientation of the head. The position of the head as a function of the acquired slice number is displayed on the right. The motion corresponds to a single continuous head rotation about the yaw axis.
Figure 7. Measured 6DOF of the head, with motion correction off (left) and on (right).
reduce measurement redundancy. In the near future the proposed method may be used for head examinations of non-cooperative patients, e.g. children or demented subjects.
Acknowledgments

The research work is funded by the European Union IST-2000-28168 (Information Society Technologies) program. The project is named MRI-MARCB.

References
1. Babak A. Ardekani et al.: A quantitative comparison of motion detection algorithms in fMRI. Magn. Reson. Imag. 19, 959 (2001).
2. Pipe JG et al.: Motion correction with PROPELLER MRI: application to head motion and free-breathing cardiac imaging. Magn. Reson. Med. 42, 963 (1999).
3. Thesen S et al.: Prospective acquisition correction for head motion with image-based tracking for real-time MRI. Magn. Reson. Med. 44, 457 (2000).
4. Fu ZW et al.: Orbital navigator echoes for motion measurements in magnetic resonance imaging. Magn. Reson. Med. 34, 746 (1995).
5. Welch EB et al.: Spherical navigator echoes for full 3D rigid body motion measurement in MRI. Magn. Reson. Med. 47, 32 (2002).
6. Roger R. Fulton et al.: Correction for head movements in positron emission tomography using an optical motion-tracking system. IEEE Trans. Nucl. Sci. 49, 116 (2002).
7. Christian Dold et al.: The compensation of head motion artifacts using an infrared tracking system and a new algorithm for fMRI. Medicine Meets Virtual Reality 12, 75 (2004).
8. Schwald Bernd et al.: A flexible tracking concept applied to medical scenarios using an AR window. Proc. International Symposium on Mixed and Augmented Reality (ISMAR), 261 (2002).
Calibration and Accuracy Analysis
NON-INVASIVE INTRAOPERATIVE IMAGING USING LASER RADAR IMAGING IN HIP JOINT REPLACEMENT SURGERY*

GÜNTER KOMPA
University of Kassel, Department of High-Frequency Engineering, Wilhelmshöher Allee 73, D-34121 Kassel, Germany
GEORGE KAMUCHA†
Moi University, Department of Electrical & Communication Technology, Eldoret, Kenya
This paper presents a clinically tested scanning pulsed laser radar for imaging of the acetabulum in hip joint replacement surgery for registration purposes. The radar is a class-1 laser system and therefore totally eye-safe. At a measurement distance of about 1 m the measurement uncertainty has proven better than 1 mm in all three x-, y- and z-directions. A measurement time of 3.8 min has been achieved for a scanning area of 40 mm x 40 mm, including averaging over eight individual measurements for each image point.
1. Introduction
Medical images from the commonly used diagnostic tools in surgical applications, such as computed tomography and magnetic resonance, provide highly accurate and informative insight into internal anatomical structure and function, but typically contain no information on the coordinate system of the patient during the operation. In a conventional surgical procedure, correspondences between the medical images and the position of the patient in the operation room are drawn mentally by the surgeon. The clinical outcome of such an operation therefore depends strongly on the surgical skill and the state of mind of the surgeon during the intervention. These limitations have been greatly reduced by the emergence of Computer Assisted Surgery (CAS), in which a computer system accurately controls the execution of an operation whose plan is often generated from preoperative images. A key problem in CAS is registration - the process of aligning datasets to the same coordinate frame [1]. This paper presents a new non-invasive registration technique using a high resolution pulsed laser radar system in computer assisted hip joint replacement surgery. The procedure is illustrated in Fig. 1. In the preoperative planning phase

* This work was performed within the European Research Project INTAS
(Project Reference: INTAS-51615009).
† Formerly with the Department of High-Frequency Engineering, University of Kassel
Figure 1. New laser radar based registration technique (workflow: MRI images and procedure plan preoperatively; radar imaging, surface-based registration and spatial reconciliation of plan with execution intraoperatively on the patient).
images of the hip joint are obtained from X-ray computed tomography (CT) or magnetic resonance imaging (MRI). On the basis of the recorded images an operation plan is developed. Then, in the operation room, the spatial orientation of the hip joint is determined (registration). After registration the preoperative plan is modified and matched to the actual position of the patient so that execution can take place. Various registration methods are known; X-ray imaging, for example, is counted among the anatomy-based registration approaches. Although this method is non-invasive, the patient is exposed to hazardous radiation. The widely used invasive fiducial-based registration relies on pins or screws implanted before the final operation. In this paper a new registration procedure is presented which is harmless, non-invasive and based on a pulsed laser radar for precise surface-based registration.

2. Pulsed laser radar system
A simplified block diagram of the laser system is shown in Fig. 2. A detailed description of the system is given e.g. in [2]-[6]. Key features demanded and achieved for the discussed application are: i) ranging uncertainty less than 1 mm using averaging over eight measurement results for a given image point, ii) measurement uncertainty of the scanning device less than 1 mm in x- and y-direction at a measurement distance of about 1 m, iii) laser safety using extremely short laser pulses from the SH laser diode LD60 with an emission area of 76 µm × 2 µm, typically 128 W optical output power and 44 ps pulse width (FWHM). The pulse repetition rate (PRF) is 80 kHz. Thus the laser diode emits an average power P_opt = P_peak · τ_p · PRF = 0.45 mW, which is fully within the laser class 1 permissible limit value.
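The quoted average power follows directly from the stated pulse parameters; the short check below reproduces the arithmetic (the variable names are illustrative):

```python
# Average optical power of a pulsed source:
# P_avg = P_peak * pulse_width * pulse_repetition_frequency
p_peak = 128.0   # W, peak optical output power
tau_p = 44e-12   # s, pulse width (FWHM)
prf = 80e3       # Hz, pulse repetition rate

p_avg = p_peak * tau_p * prf
print(f"P_avg = {p_avg * 1e3:.2f} mW")  # -> P_avg = 0.45 mW
```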
Figure 2. Block diagram of laser radar used for registration.
3. Registration algorithm and verification
For registration, volumetric preoperative CT and MRI datasets were used to derive the surface model shape. MRI was preferred for the clinical trials because of its superior soft-tissue contrast, which allows extraction of the cartilage on the acetabulum. Intraoperative laser radar based surface datasets provide the data shape. Registration aims at the alignment of the two corresponding datasets, i.e. data and model shape, using an optimization algorithm. The standard iterative closest point (ICP) algorithm was applied for registration with some modifications: a threshold-based outlier detection technique to suppress outliers, random perturbation for escaping from local minima near the global minimum, and selection of three corresponding surface points to start the optimization process, avoiding trapping in distant local minima.
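As a hedged illustration of the core of this procedure, the sketch below implements a minimal point-to-point ICP with the threshold-based outlier gate mentioned above; the random-perturbation escape and three-point initialization are omitted for brevity, and the brute-force nearest-neighbour search is chosen for clarity rather than speed. This is not the authors' implementation.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def icp(data, model, iters=50, outlier_thresh=np.inf):
    """Align `data` (Nx3) to `model` (Mx3); correspondences farther than
    `outlier_thresh` are rejected (the threshold-based outlier gate)."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = data.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - model[None, :, :]) ** 2).sum(-1)
        nn = d2.argmin(axis=1)                       # closest model point
        dist = np.sqrt(d2[np.arange(len(cur)), nn])
        keep = dist < outlier_thresh                 # suppress outliers
        R, t = best_rigid_transform(cur[keep], model[nn[keep]])
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t      # accumulate transform
    return R_tot, t_tot
```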
Figure 3. Surface of cow’s cadaver socket bone (left: extracted from CT data, right: extracted from laser radar data) and identified corresponding anatomical landmarks for initial point registration.
The performance of the registration algorithm was scrutinized. The algorithm's efficiency was assessed on the basis of synthetic datasets: the model shape was a computer-generated prism and the data shape was a transformed model with a given rotation offset. The measurement uncertainty was investigated using a precisely manufactured prism of known geometric dimensions (model shape) and surface data measured by the laser radar (data shape). The measurement reliability was analysed using a CT image of a cow's cadaver (model shape) and laser radar surface data (data shape). Finally, proof of clinical suitability was based on an MRI image of the acetabulum and the surface of the acetabulum obtained by intraoperative measurement with the laser radar. Detailed results on the reliability and robustness of the developed registration algorithm, also coping with spurious signals from the surgical site, are presented in [1]; therefore only exemplary results are presented here to indicate the performance and usefulness of the developed imaging system. Fig. 3 shows the CT based dataset of a cow's cadaver socket bone and the corresponding surface model measured by the laser radar. To avoid distant local minima during the optimisation process, three anatomical landmarks for corresponding point registration were identified. After registration processing, the distance differences between CT data and measured laser data were found to be largely within 1 mm over the scanned area. Outliers with differences above 1 mm make up only 5 percent of the whole dataset. After removal of outliers, a maximum distance difference of 0.92 mm was obtained, and the uncertainty in registration was of the order of 10⁻⁴ mm (translation) and degrees (rotation).

4. Clinical test
After intensive and promising investigations in a lab environment, the system had to prove itself during a hip joint operation performed at the orthopaedic clinic of Kassel. Preoperative data of the patient were acquired with a 1.5 T MR imager (fat-suppressed T1-weighted 3D gradient echo pulse sequence, resolution: 1 × 1 × 1 mm) at the Clinic Center (Kassel). Fig. 4 shows the extracted 3D surface of the patient's acetabulum.
Figure 4. Extracted 3D surface of the acetabulum of a patient planned for hip joint replacement.
Figure 5. Opening of the patient's hip joint after dislocation.
Figure 6. Intraoperative setup of patient and laser system during scanning.
During the actual surgical operation, the surgeon generally separates the muscles, ligaments and tendons to access the hip joint. The hip joint is then dislocated by separating the head of the femur from the acetabulum. Fig. 5 shows the opening of the patient's hip joint after dislocation. Metallic retractors are used to enlarge the acetabular exposure. The positioning of the laser radar can be seen in Fig. 6. The measurement distance was 85 cm and the scanned area 40 mm x 40 mm. For the measurements, tissue was carefully removed from the outer edges of the acetabulum as well as from the acetabular fossa, which is perforated by numerous apertures and lodges a mass of fat. Furthermore, nerves and blood vessels supplying the joint with nutrients hampered the measurement. Since these tissues get torn during the process of separating the femoral head from the acetabulum, there was a need to remove them, where possible, before collecting surface data with the laser radar system. The dynamic range of the received radar signals was moderate: signal amplitudes varied only marginally, between 35 mV and 67 mV. Thus, even at sharply inclined bone surfaces, sufficiently large radar signals could be observed. Experience has shown that wet slanted bone surfaces provide better reflectance than dry bone surfaces with partly specular properties.
Figure 7. Intraoperative dataset attained by laser radar.
Fig. 7 shows the acetabulum measured by the laser radar system (resolution of surface points: 1 mm x 1 mm). Registration was carried out using the measured radar based surface points as the data shape and the surface extracted from the MRI data as the model shape. A rough initial alignment was obtained by first identifying three anatomical landmark points on the MRI model of the acetabulum and the corresponding landmark points on the laser radar based surface shape. The resulting transformation estimate was then used to start the ICP algorithm. Fig. 8 shows the laser surface points overlaid on the MRI model after convergence of the registration algorithm and evaluation of the distance differences. Black points represent distance differences less than 1 mm. The grey points are within 1 to 3 mm, and the white squares (post-processed for better visibility) represent differences greater than 3 mm. The laser radar measured surface comprises 1474 points and covers about 71 percent of the acetabulum's surface. Of these surface points, 633 were found to have a distance difference beyond 1 mm and were regarded as outliers in the subsequent ICP runs; 186 points had distance differences above 3 mm. It was found that the majority of the large residues lie in the regions where spurious soft tissues
Figure 8. Patient’s laser radar measured surface points overlaid on the MRI surface after the first registration trial.
Figure 9. Change in rotation as a function of the percentage of outlier points removed.
would be found during intraoperative laser radar imaging, i.e. on the rim of the acetabulum and at the acetabular fossa. 57.3 percent of the outliers, including the points with the largest residues, were found to lie on the acetabular fossa. Fig. 9 shows the change in rotation as a function of the percentage of outlier points eliminated from the laser data set. It could be shown that complete removal of the laser data points from the region of the acetabular fossa has a strong influence on the registration parameters. The remaining outliers cause a deviation of only 0.0054 degrees. As reported in [1], a deviation of 0.0127 mm is found for the translation. Simulations of extreme impairment of the measurement in a clinical environment have shown that the maximum expected registration errors would be 0.046 degrees and 0.49 mm in rotation and translation respectively.
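The inlier/outlier bookkeeping used in the evaluation above (points within 1 mm; outliers beyond 1 mm excluded from subsequent ICP runs; gross residues beyond 3 mm) can be sketched as follows. The function and key names are illustrative, and `dist_mm` is assumed to hold the per-point distance differences after registration.

```python
import numpy as np

def classify_residuals(dist_mm):
    """Bin registration residuals the way the overlay in Fig. 8 is coded:
    < 1 mm (black), >= 1 mm (outliers, grey/white), > 3 mm (gross, white)."""
    d = np.asarray(dist_mm, dtype=float)
    return {
        "inliers_lt_1mm": int((d < 1.0).sum()),
        "outliers_ge_1mm": int((d >= 1.0).sum()),  # excluded from later ICP runs
        "gross_gt_3mm": int((d > 3.0).sum()),
    }
```

Applied to the clinical dataset described above, such a grouping would reproduce the reported 1474-point / 633-outlier / 186-gross-residue breakdown.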
5. Conclusions

A 3D laser radar system suited for registration in hip joint replacement surgery was presented. The assembly is compact, mobile and packaged for sterile clinical use. A modified, robust ICP registration algorithm was developed and tested by means of synthetic and measured datasets. The minimum measurement time of 3.8 minutes has proved acceptable for practical clinical use. The clinical trials showed that coverage of 71 percent of the acetabulum is sufficient for successful registration. Most of the outliers (57.3 percent) were located at the acetabular fossa. Outlier removal drastically improves the registration uncertainty: rotation and translation uncertainties of 0.0054 degrees and 0.0127 mm, respectively, have been attained. From simulation of extreme disruption of the measurement, registration errors of 0.046 degrees and 0.49 mm are expected. The laser radar based model shaping depends entirely on the surface integrity of the part of the body under consideration. Based on the segmentation of 3D MR images it has been shown that cartilage on the hip joint can be considered in the registration process. Successful application of the presented registration method requires full access of the laser beam to the opened hip joint; restrictions occurred with strongly adipose femoral regions. Regarding further research, laser radar imaging seems very promising for knee joint registration.

Acknowledgments

Clinical trials offered by Prof. W. Siebert and Dr. B. Schlangmann, both with the Orthopaedic Clinic, Kassel, and high resolution MR images delivered by Prof. F. Kuhn and Dr. P. Reuter, both with the Clinic Center, Kassel, are gratefully acknowledged. Sincere thanks are also given to J. Weide for the preparation of graphical material.
References
1. G. Kamucha, Doctoral Thesis, kassel university press (2004).
2. G. Kompa, IEEE Trans. Instrum. Meas. 33, 97 (1984).
3. G. Kompa, V. Gorfiiel, J. Sola, A. Stoke, W. Vogt and F. Volpe, 26th EuMC Proc., 147 (1996).
4. A. Biernat and G. Kompa, Optical and Quantum Electronics 31, 981 (1999).
5. G. Kamucha and G. Kompa, 30th EuMC Proc., 321 (2000).
6. G. Kamucha and G. Kompa, Proc. ODIMAP III, Pavia Univ., Italy, 105 (2001).
ACCURACY IN COMPUTER ASSISTED IMPLANT DENTISTRY: IMAGE GUIDED TEMPLATE PRODUCTION VS. BURR TRACKING

GERLIG WIDMANN
Interdisciplinary Stereotactic Intervention- and Planning Laboratory (SIP-Lab), Department of Radiology I (Chairman Prof. W. Jaschke), University Innsbruck, Anichstr. 35, 6020 Innsbruck, AUSTRIA

ROLAND WIDMANN
Dental Laboratory Czech, Kreuzstr. 20, 6067 Absam, AUSTRIA

EKKEHARD WIDMANN, MD
Dental Practice Dr. Widmann, Unterauweg 7a, 6280 Zell/Ziller, AUSTRIA

RETO JOSEF BALE, MD
Interdisciplinary Stereotactic Intervention- and Planning Laboratory (SIP-Lab), Department of Radiology I (Chairman Prof. W. Jaschke), University Innsbruck, Anichstr. 35, 6020 Innsbruck, AUSTRIA
The accuracy of a new concept for image guided template production (SIP-LAB IGTP) was tested in a phantom study. A complete set of dental casts was prepared with lead pellets. A CT-scan with an individual VBH vacuum mouthpiece (Medical Intelligence, Germany) and an external registration frame attached to the dental cast was performed. Using the optically based Treon navigation system (Medtronic Inc., USA), a surgical path to the target pellets was planned and an aiming device was adjusted according to the plan. Burr tubes were positioned by a metal rod advanced through the aiming device and glued into prefabricated templates with blue light curing resin. Drillings were made through the templates on the dental casts. On the 3D CT-data of the drilled dental casts, the accuracy in the xy-axis was determined as the normal deviation from the defined target. As the duplicated dental casts lack lead pellets, image fusion with the CT-data of the dental casts with lead pellets was necessary. The accuracy in the z-axis was measured separately. With a mean accuracy of [xy] 0.48 mm +/- 0.35 mm and [z] 0.25 mm +/- 0.12 mm, accuracies similar to burr tracking could be achieved. However, a safety distance of 1.5 mm is necessary to reliably avoid damaging critical anatomical structures such as the mandibular nerve or the maxillary sinus. The SIP-LAB IGTP technique allows for robot assisted positioning of burr tubes, which may lead to even better results. Regarding costs and effort, the SIP-LAB IGTP technique realized at hospital based planning centers is an attractive alternative to burr tracking systems, especially for dentists and oral implantologists working in private practice.
1. Introduction
Computer aided dental implant positioning can be realized with two different approaches: image guided burr tracking and image guided template production (IGTP). Although several studies on the accuracy of burr tracking exist, there is only sparse data on the accuracy of image guided template production. In this paper, we present an accuracy analysis of our Interdisciplinary Stereotactic Intervention- and Planning Laboratory Innsbruck (SIP-Lab) IGTP technique [1] and a comparison to image guided burr tracking.
2. Methods
A functional dental impression of one of our co-workers was used to build a complete standard set (maxilla, mandible) of class III dental stone casts. In order to obtain CT-visible targets, a total of 56 holes were drilled laterally near the apex of the hypothetical dental roots and lead pellets were glued into the holes. The dental casts were mounted on the SAM II articulator system (SAM Präzisionstechnik GmbH, Germany). A Vogele-Bale-Hohner (VBH) mouthpiece (Medical Intelligence GmbH, Germany) [2,3] was prepared with dental impression resin and both a maxillary and a mandibular impression were made with the help of the articulator. The VBH-mouthpiece and the SIP-Lab external reference frame containing 7 CT markers were attached to the dental stone cast [4], and axial computed tomography of the dental casts was performed on the Spiral-CT LightSpeed QX/i (GE Medical Systems Inc., USA) with standard parameters for dental CT (Fig. 1). The CT-data was transferred to the PACS of the University Hospital Innsbruck in DICOM format and to the Treon optical navigation system (Medtronic Inc., USA).
Figure 1. CT-scan of the dental casts with the VBH-mouthpiece and the attached reference frame.
The planning set up consisted of a base plate (Fig. 2A), an arm to fix the unmodified dental cast (Fig. 2B), a target device (Fig. 2C) and a reference arm for the optical navigation system (Fig. 2D). Registration of the planning set up was done via the reference frame attached to the VBH-mouthpiece. A surgical path with the entry in the centre of the dental crown and the target in the centre of the lead pellet on the CT-data was planned and the aiming device was adjusted with the navigation system according to the plan (Fig. 2).
Figure 2. Set-up with aiming device introduced into the targeting device.
Guided in the aiming device, burr tubes were positioned by a metal rod and glued into prefabricated templates using blue light curing resin. Drillings were made through the templates on the dental casts with lead pellets as well as on a second set of duplicated casts without containing lead pellets.
3. Results
The accuracy was evaluated on the 3D CT-data of the drilled dental casts. A path with its entry point in the middle of the proximal drill hole and its target point in the middle of the distal drill hole was drawn and virtually elongated to the centre of the target balls. In the "probe's eye view" the accuracy in the xy-axis was ascertained as the normal deviation from the defined target (Fig. 3).
Figure 3. Accuracy in the xy-axis.
Due to the fact that the duplicated dental casts lack lead pellets, image fusion with the planning CT-data of the dental casts with lead pellets was
performed. As the duplicated dental casts are a precise copy of the original, this was possible using the markers on the reference frame. In "Blend Setting Mode 0", the drilled holes were drawn on the CT-data of the drilled duplicated dental casts and faded into the CT-data of the dental casts with lead pellets in "Blend Setting Mode 100". The accuracy was evaluated using the method described above. A total of 56 drillings were evaluated. The mean accuracy of the 28 drillings on the dental casts with lead pellets was M(p)[xy] 0.48 mm with SD(p)[xy] 0.35 mm; the maximum deviation was Max(p)[xy] 1.2 mm. The following 28 drillings on the duplicated dental casts without lead pellets showed a mean accuracy of M(d)[xy] 0.5 mm with SD(d)[xy] 0.34 mm; the maximum deviation was Max(d)[xy] 1.2 mm. In addition, to determine the accuracy in the z-axis, 50 point measurements to a predefined target were performed through the aiming device and showed a mean deviation of M[z] 0.25 mm with SD[z] 0.12 mm. The maximum deviation was Max[z] 0.6 mm (see Figure 4).
Figure 4. Accuracy of SIP-Lab IGTP (p = casts with lead pellets, d = casts without lead pellets).
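The M/SD/Max figures reported above can be reproduced from the raw per-drilling deviations with a summary like the sketch below. Whether the authors used the sample or the population standard deviation is not stated, so the sample form (ddof=1) is an assumption, as is the function name.

```python
import numpy as np

def accuracy_summary(dev_mm):
    """Mean (M), standard deviation (SD) and maximum (Max) of
    per-drilling deviations, in mm (sample SD assumed)."""
    d = np.asarray(dev_mm, dtype=float)
    return {"M": d.mean(), "SD": d.std(ddof=1), "Max": d.max()}
```

Running this once per group (casts with pellets, duplicated casts, z-axis point measurements) yields the triplets plotted in Figure 4.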
4. Discussion
Brief et al. [5] investigated the accuracy of DENX-IGI burr tracking (DenX Ltd, Jerusalem, Israel). In 38 drillings, DENX-IGI showed a mean accuracy of M[x] 0.6 mm, M[y] 0.3 mm and a maximum deviation of Max[x] 1.1 mm and Max[y] 1 mm. The accuracy in the z-axis was M[z] 0.2 mm and Max[z] 0.7 mm. Compared to DENX-IGI similar results were found for SIP-Lab IGTP (Fig. 5).
Figure 5. Accuracy of DENX-IGI (dark) vs. SIP-Lab IGTP (light).
Both methods provide sufficient accuracy for use on patients. However, a minimum safety distance of 1.5 mm is necessary to reliably avoid damaging critical anatomical structures such as the mandibular nerve or the maxillary sinus. In the near future the SIP-LAB IGTP technique may be combined with a robotic system, which might provide even better results. As the dentist does not need to buy a navigation system and there is no need for intraoperative registration, SIP-LAB IGTP realized at hospital based planning centers is an attractive alternative to burr tracking systems, especially for dentists and oral implantologists working in private practice.
References
1. G. Widmann, MD-thesis (2003).
2. R. J. Bale, J. Burtscher, W. Eisner, A. Obwegeser, M. Rieger, R. A. Sweeney, A. Dessl, S. M. Giacomuzzi and W. Jaschke, J Neurosurg. 93:208-213 (2000).
3. A. Martin, R. J. Bale, M. Vogele, A. R. Gunkel, W. F. Thumfart, W. Freysinger, Radiology. 208(1):261-5 (1998).
4. R. A. Sweeney, R. J. Bale, R. C. Moncayo, K. Seydl, T. Trieb, W. Eisner, E. Donemiller, P. Lukas, Strahlenther Onkol. 179(4):254-60 (2003).
5. J. Brief, S. Hassfeld, U. Sonnenfeld, N. Persky, R. Krempien, M. Treiber and J. Muhling, CARS 2001. 1230:739-747 (2001).
3D-ACCURACY ANALYSIS OF FLUOROSCOPIC PLANNING AND NAVIGATION OF BONE-DRILLING PROCEDURES

J. A. K. OHNSORGE, M. WEISSKOPF
Orthopädische Universitätsklinik, Rheinisch-Westfälische Technische Hochschule Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany

E. SCHKOMMODAU
Institut für Biomedizinische Technologie (IBMT), Rheinisch-Westfälische Technische Hochschule Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany

J. E. WILDBERGER
Klinik für Radiologische Diagnostik, Rheinisch-Westfälische Technische Hochschule Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany

A. PRESCHER
Institut für Anatomie, Rheinisch-Westfälische Technische Hochschule Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany

C. H. SIEBERT
Orthopädische Universitätsklinik, Rheinisch-Westfälische Technische Hochschule Aachen, Pauwelsstrasse 30, 52074 Aachen, Germany
Many orthopaedic procedures require accurate drilling in bone, and the outcome is frequently dependent on the geometric accuracy of this surgical step, which can be improved by computer assistance. Reliability, accuracy and the benefit of this new method for the patient, as well as for the surgical staff, were analysed in a series of experimental trials. The fluoroscopically assisted freehand navigation enabled highly accurate three-dimensional tip placement when drilling in bone and decisively reduced radiation exposure to a minimum. It represents a promising and efficient application for a variety of procedures in orthopaedic surgery.
1. Objectives / Background
Accurate drilling in bone is an essential part of many orthopaedic operations. When a small necrotic area of the femoral head needs to be revascularised by bruising the sclerotic margin, when a screw is to be placed, or, especially, when a biopsy has to be taken from a tumor, the conventional 2D C-arm control is uncomfortable. The geometric accuracy of the usually free-hand executed drilling is likely to be improved by fluoroscopic navigation. Though a
volumetric model is not generated, 3D orientation can be provided by the simultaneous display of more than one x-ray image at a time. Virtual imaging of the navigated instruments helps in planning as well as in executing the intervention. Reliability, accuracy and intraoperative applicability of this computer-assisted method need to be analysed: only an evidence-based benefit may justify additional expenditure of whatever kind. Radiation exposure and total operating time are expected to be reduced, but this must be quantified before clinical application of the method. Besides, empirical revelation of defective details can possibly lead to decisive improvements.
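The 3D orientation obtained from two simultaneously displayed, registered x-ray views rests on standard two-view triangulation. The sketch below shows that geometry with a linear least-squares (DLT) solution; the 3×4 projection matrices are assumed to be known from C-arm calibration, and the code is illustrative rather than the FluoroNav implementation.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Least-squares 3D point from two calibrated views (DLT).
    P1, P2: 3x4 projection matrices of the two C-arm views;
    uv1, uv2: (u, v) image coordinates of the same anatomical point."""
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null-space vector minimises |AX|
    return X[:3] / X[3]        # de-homogenise
```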
Figure 1. Fluoroscopic navigation: Planning and execution of a drilling procedure at the hip with simultaneous display of multiple radiographic views.
2. Design / Methods
In a standardised in vitro trial, drilling onto a 5 mm spherical target implanted in an artificial femoral head was performed using a navigated drill guide, a navigated drill and the FluoroNav™ system. In view of the primarily aspired minimally invasive character of the procedure, a new dynamic reference base (DRB) was developed that avoids additional trauma through its fixation to the bone.
In one group (A) the surgeon was asked to place the drill in the middle of the BaSO4-augmented plaster bullet. In the other group (B) he additionally had to follow a defined direction. For that purpose a retrograde canal of 15 mm length was drilled with a 2 mm Ø K-wire from the target towards the lateral aspect of the femur and then filled with the same plaster mixture. For both groups the distance of the tip of the drill to the center of the lesion was analysed in a 3D CT-generated model and in macroscopic cross section. In group B the direction of the actual drilling canal was measured relative to the preformed one.
Figure 2. CT-generated 3D model of femur and BaSO4-augmented plaster bullet with projected drilling.
3. Results
The mean distance in group A was measured to be 1 mm, with all results ranging between 0 and 2.5 mm and the lesion never being missed. In group B the planned direction of the canal was reproduced with a deviation of 0° to 7°, the center of the target (Ø 5 mm) being missed by a mean distance of only 2.5 mm and a maximum of 3.5 mm. The mean distance fault of both groups together
was calculated as 1.9 mm, representing a proportional error of 2% with respect to the 8.7 cm average length of the drilling canal. Compared with the macroscopic and 3D-CT findings, the data calculated by the navigation system were accurate up to a maximum divergence of 4° or 2 mm in single cases. Owing to the manual performance, the virtual image of the effected drilling and the previously planned trajectory differed by 0 to 2 mm and 3°, respectively. Radiation exposure was reduced to just a few x-ray images, compared with an up to ten times higher amount caused by conventional permanent C-arm control, whereas there was no difference in the total operating time.
Figure 3. DISOS 3D accuracy analysis of the drilling procedure displaying distance and divergence.

4. Conclusion
Fluoroscopically assisted freehand navigation provides high three-dimensional accuracy while reducing radiation exposure to a minimum. With the new DRB, computer assistance also becomes feasible for percutaneous procedures, representing a promising and efficient application for a greater variety of orthopaedic operations.
Acknowledgments
The work was supported by Medtronic SNT, which provided the StealthStation® and the FluoroNav™ system. Evaluation was made possible by the use of DISOS - Desktop Image-processing System for Orthopaedic Surgery, established at the IBMT.
References
1. Hofstetter R, Slomczykowski M, Sati M, Nolte LP. Fluoroscopy as an imaging means for computer-assisted surgical navigation. Comput Aided Surg 1999; 4:65-76.
2. Foley KT, Rampersaud YR, Simon DA. Virtual fluoroscopy. Operative Techniques in Orthopedics 2000; 10:77-81.
3. Ohe T, Nakamura K, Matsushita T, Nishiki M, Watanabe N, Matsumoto K. Stereo fluoroscopy-assisted distal interlocking of intramedullary nails. J Orthop Trauma 1997; 11:300-3.
4. Brack CH, Burgkart R, Roth M, Schweikard A. Accurate x-ray navigation in computer-assisted orthopaedic surgery. Computer Assisted Radiology and Surgery (CARS) 1998, June 24th-27th, Tokyo, Japan.
5. Van Tiggelen R. Since 1895, orthopaedic surgery needs x-ray imaging: a historical overview from discovery to computed tomography. JBR-BTR 2001; 84:204-13.
6. Norris BL, Hahn DH, Bosse MJ, Kellam JF, Sims SH. Intraoperative fluoroscopy to evaluate fracture reduction and hardware placement during acetabular surgery. J Orthop Trauma 1999; 13:414-7.
7. Hufner T, Pohlemann T, Tarte S, Gansslen A, Citak M, Bazak N, Culemann U, Nolte LP, Krettek C. Computer-assisted fracture reduction: novel method for analysis of accuracy. Comput Aided Surg 2001; 6:153-9.
8. Gueziec A, Wu K, Kalvin A, Williamson B, Kazanzides P, Van Vorhis R. Providing visual information to validate 2-D to 3-D registration. Med Image Anal 2000; 4:357-74.
9. DiGioia AM, Jaramaz B, Blackwell M, Simon DA, Morgan F, Moody JE, Nikou C, Colgan BD, Aston CA, Labarca RS, Kischell E, Kanade T. The Otto Aufranc Award. Image guided navigation system to measure intraoperatively acetabular implant alignment. Clin Orthop 1998; 355:8-22.
10. Yao J, Taylor RH, Goldberg RP, Kumar R, Bzostek A, Van Vorhis R, Kazanzides P, Gueziec A. A C-arm fluoroscopy-guided progressive cut refinement strategy using a surgical robot. Comput Aided Surg 2000; 5:373-390.
11. Nolte LP, Slomczykowski MA, Berlemann U, Strauss MJ, Hofstetter R, Schlenzka D, Laine T, Lund T. A new approach to computer-aided spine surgery: fluoroscopy-based surgical navigation. Eur Spine J 2000; 9 (Suppl 1):78-88.
12. Gautier E, Bachler R, Heini PF, Nolte LP. Accuracy of computer-guided screw fixation of the sacroiliac joint. Clin Orthop 2001; 393:310-7.
13. Loder RT. Unstable slipped capital femoral epiphysis. J Pediatr Orthop 2001; 21:694-9.
14. Schnieders MJ, Dave SB, Morrow DE, Heiner AD, Pedersen DR, Brown TD. Assessing the accuracy of a prototype drill guide for fibular graft placement in femoral head necrosis. Iowa Orthop J 1997; 17:58-63.
15. Hinsche AF, Giannoudis PV, Smith RM. Fluoroscopy-based multiplanar image guidance for insertion of sacroiliac screws. Clin Orthop 2002; 395:135-44.
16. Nolte LP, Zamorano LJ, Jiang Z, Wang Q, Langlotz F, Berlemann U. Image-guided insertion of transpedicular screws. A laboratory set-up. Spine 1995; 20:1497-500.
17. Merloz P, Tonetti J, Eid A, Faure C, Lavallee S, Troccaz J, Sautot P, Hamadeh A, Cinquin P. Computer assisted spine surgery. Clin Orthop 1997; 337:86-96.
18. Browbank I, Bouazza-Marouf K, Schnabler J. Robotic-assisted internal fixation of hip fractures: a fluoroscopy-based intraoperative registration technique. J Eng Med 2000; 214:165-179.
19. Ohnsorge JAK, Schkommodau E, Wirtz DC, Wildberger JE, Prescher A, Siebert CH. Accuracy of fluoroscopically navigated drilling procedures at the hip. Z Orthop 2003; 141(1):112-9.
20. Ohnsorge JAK, Portheine F, Mahnken AH, Prescher A, Wirtz DC, Siebert CH. Computer-assisted retrograde drilling of osteochondritic lesions of the talus with the help of fluoroscopic navigation. Z Orthop 2003; 141(4):452-458.
21. Ohnsorge JAK, de la Fuente M, Jetzki S, Radermacher K, Wirtz DC. Intraoperative 3D reconstruction of the PMMA plug for computer-assisted revision of total hip arthroplasty based on 2D x-ray images. Z Orthop 2003; 141(5):531-539.
ACCURACY OF FLUOROSCOPIC AND NAVIGATED CONTROLLED HIND- AND MIDFOOT CORRECTION OF DEFORMITIES - A FEASIBILITY STUDY JENS GEERLING, STEFAN ZECH, DANIEL KENDOFF, TOBIAS HUFNER, MARTINUS RICHTER, CHRISTIAN KRETTEK Trauma Department, Hannover Medical School (MHH), Hannover, 30625, Germany
The accuracy of the intraoperative correction of hind- and midfoot deformities correlates with the clinical result. However, intraoperative correction under fluoroscopy control is challenging. The aim of the study was to compare CT-based navigated correction of hind- and midfoot deformities with fluoroscopy-controlled correction. Plastic models with defined deformities were used. The aim of the correction was to transform the shape of the specimen with deformity into the shape of a normal plastic bone model. Two methods were used: 1. conventional C-arm, 2. navigated CT-based correction (Surgigate, Medivision, Switzerland). The visualization was provided by the image only; the direct view of the surgeon was blocked by drapes. The following parameters were registered: 1. time for the procedure, 2. fluoroscopy time, 3. difference between the corrected specimen and the normal model. The accuracy of the correction with computer-assisted navigation was higher than with conventional fluoroscopy in this experimental setting. In conclusion, CAS might be a valuable tool for correction operations in foot surgery.
1. Introduction
The accuracy of the intraoperative correction correlates with the clinical result, but precise intraoperative correction under fluoroscopy control is challenging. In other body regions CAS has been shown to be a valuable tool for correction operations and may also be helpful in corrections of foot deformities. The aim of the study was to compare CT-based navigated correction of hind- and midfoot deformities with fluoroscopy-controlled correction.

2. Material and Methods
Plastic models with defined deformities (Sawbone, Pacific Research Laboratories, Vashon, WA, USA) were used. The aim of the correction was to transform the shape of the specimen with deformity into the shape of a normal plastic bone model (Fig. 1, 2).
For the accuracy analysis two methods were used: 1. conventional C-arm based correction, 2. navigated CT-based correction (Surgigate, Medivision, Oberdorf, Switzerland). Of each deformity (equinus deformity, calcaneus malunion, equinovarus deformity), five models were corrected with each method. Standardized osteotomies were performed before the correction when necessary (calcaneus malunion and equinovarus).
Figure 1: The Sawbone model of a normal foot. The shape of this model should be achieved during correction.
Figure 2: The model of an equinus deformity. Using standardised osteotomies, the shape of the model with no deformity should be achieved.
The only visualization was provided by the image of the fluoroscope (Fig. 3) or the navigation system; the direct view of the surgeon was blocked by drapes (Fig. 4). The retention was performed using K-wires. The following parameters were registered: 1. time for the entire procedure, 2. fluoroscopy time, 3. difference between the corrected specimen and the normal model in foot length, length and height of the longitudinal arch, calcaneus inclination and hindfoot angle for all models (n=30), and additionally Boehler's angle and calcaneus length for the calcaneus malunion models (n=10). The shape of the corrected models was graded as normal, nearly normal, abnormal or severely abnormal.
Figure 3: During fluoroscopy-based correction the direct view of the model was blocked by drapes. The only visualisation was provided by the monitor.
3. Results
The shape was graded as normal in all specimens of the CAS group (n=15) and in eight models of the fluoroscopy group. The remaining grades in the C-arm group were six nearly normal and one abnormal (p=0.05, Chi2-test). The other parameters were compared using the t-test.

Parameter               CAOS           Fluoroscopy    Significance
Time of procedure (s)   782            410            p<0.001
Fluoroscopy time (s)    0              11             p<0.001
Foot length             -1.7±1.9 mm    -4.1±3.8 mm    p=0.03
Length longit. arch     -0.9±0.9 mm    -5.6±4.9 mm    p=0.001
Height longit. arch     -0.1±0.5 mm    1.7±4.3 mm     p=0.14
Calcaneus inclination   0.1±1.4°       2.7±4.8°       p=0.05
Calcaneus length        -0.5±0.4°      -2.8±1.3°      p=0.005
Boehler's angle         0.4±1.1°       4.1±8.6°       p=0.37
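As an illustration of the comparison reported above, a two-sample t statistic can be recomputed directly from the tabulated means, standard deviations and group sizes. This is a hedged sketch: the paper does not state which t-test variant was used, so a Welch (unequal-variance) statistic is assumed here, and the function name is ours.

```python
import math

# Hedged sketch: two-sample (Welch) t statistic from summary values.
# The paper only says "t-test"; the Welch form is an assumption.

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t = (m1 - m2) / sqrt(s1^2/n1 + s2^2/n2)."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Foot length: CAOS -1.7 +/- 1.9 mm vs. fluoroscopy -4.1 +/- 3.8 mm
# (n = 15 per group in this study).
t_foot_length = welch_t(-1.7, 1.9, 15, -4.1, 3.8, 15)
```

For the foot-length row this yields a t value of roughly 2.2, consistent with the modest p-value reported.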
Figure 4: During navigated correction the only visualisation was provided by the navigation monitor. The optoelectronic camera was placed on the opposite side.
4. Conclusion
The accuracy of the correction of hind- and midfoot deformities with computer-assisted navigation was higher than with conventional fluoroscopy in this experimental setting. No additional intraoperative fluoroscopy was needed using CT-based navigation, compared with an average of 11 seconds using fluoroscopy-controlled correction. However, the time for the whole procedure was almost twice as long in the CAOS group. CAS allows higher accuracy of the correction and reduction and might be a valuable tool for correction operations in foot surgery. Clinical studies have to show whether this higher accuracy can be achieved in real operations as well and whether it leads to better clinical results.
NAVIGATION ERROR OF UPDATE NAVIGATION SYSTEM BASED ON INTRAOPERATIVE MRI YOSHIHIRO MURAGAKI, HIROSHI ISEKI, MADOKA SUGIURA, KOUICHI SUZUKAWA, KYOJIRO NAMBU, KINTOMO TAKAKURA Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo, 162-8466, Japan TAKASHI MARUYAMA, OSAMI KUBO, TOMOKATSU HORI Department of Neurosurgery, Tokyo Women's Medical University
A navigation system based on intra-operative magnetic resonance imaging (iMRI) is more accurate than a conventional navigation system based on preoperative images, because the error caused by intra-operative brain deformation, such as brain shift, can be eliminated. We present a study of the accuracy of updated navigation (UN) based on iMRI. The UN system was installed in the navigation system for intra-operative irradiation (PRS navigator). The accuracy was estimated retrospectively in 47 cases (94 updates) by assessing the errors with the least-squares method. The maximum errors were more than 5 mm in three cases whose iMR images were distorted. Navigational errors were also compared between two types of fiducial markers we developed. The UN with the new markers, in which the marker for iMRI and the marker for registration are different, showed significantly smaller errors (maximum 2.36 mm, mean 1.16 mm) than the UN with the previous markers (maximum 4.65 mm, mean 1.42 mm). In conclusion, the accuracy of the UN depends not only on the internal error, but also on the image quality, including distortion, and on the fiducial markers.
1. Introduction
After Kato et al.1) and Watanabe et al.2) developed navigation systems that display the area of the actual surgical field on computed tomography (CT) or MRI screens, various advanced neurosurgical navigation systems became available in the 1990s. The methods of three-dimensional localization include types operated by mechanical arm2), infrared radiation4), or magnetism. The accuracy of navigation depends on the intrinsic technical accuracy of the system, the quality of the images, and the registration procedures3). In addition, these conventional navigation systems share a common problem of navigation error due to intraoperative brain deformation and shift, since the navigation is based on preoperative images. Among the various navigation system problems, interest has recently focused on the well-known phenomenon called brain shift: a displacement of the brain induced by intraoperative
manipulations and by gravity. Nimsky et al.5) reported a mean brain shift of 8.4 mm at the cortical region and of 4.4 mm in deeper areas; these values were affected by the size of the craniotomy opening, the tissue type of the tumor, the patient's position, whether the ventricle was opened, and the volume of tumor to be removed. The essential method for overcoming the brain-shift problem is to update the navigation information intraoperatively by means of CT, MRI7), and/or ultrasound echo6). We developed an update navigation system that updates the photon radiation system (PRS) navigator, a conventional frameless navigation system based on infrared radiation, using intraoperative MRI. Although the update navigation can minimize the navigational error, error from other sources still remains. In this study, we estimated the navigation error of the update navigation by intraoperative MRI7,8), focusing on the image quality and the fiducial markers.
2. Materials and Methods
The MRI scanner (AIRIS II™, Hitachi Medical Corporation, Tokyo, Japan) used in this study was a permanent-magnet, hamburger-shaped system with a magnetic field strength of 0.3 Tesla and a gantry 43 cm in width9). MR images were obtained at 3-mm slice thickness (1.5-mm slice intervals, 100 slices) under the following conditions: field of view (FOV) 230 × 230 mm, TR 27 msec, TE 10 msec for T1-weighted spin echo; and FOV 230 × 230 mm, TR 3000 msec, TE 120 msec for T2-weighted turbo spin echo. An MRI contrast agent (10 ml of gadolinium) was administered intravenously only when the tumor mass was enhanced by the agent in the preoperative MRI. For head fixation we used slightly modified Sugita four-point head holders made of glass fiber instead of the original steel. To incorporate an MRI-coil function, a new aluminum-alloy four-point head holder was introduced and has been used since July 2001. When the MRI scanner was in use, a cover coil of copper plate coated with resin was connected as a receiving coil (Head Holder Coil, HHC, Figure 1B). The navigator (PRS navigator, Toshiba Medical, Tokyo, Japan, Figure 1A) is based on a conventional infrared location-identification device, which shows the location of the suction tip and the direction of the suction tube on a three-plane display10). For navigation, T1-, T2-, and gadolinium-enhanced T1-weighted MR images were taken at 3-mm slice thickness, and the DICOM files obtained were transferred to a computer for the PRS navigator through a local area network. More than four fiducial markers were implanted into the skull and used for registration.
Figure 1. PRS navigator and various fiducial markers. A: PRS navigator. B: Head holder coil, designed both for head fixation and for iMRI scanning. C: Fiducial marker type 1: marker (left) and titanium base (right). D: Fiducial marker type 2: the marker for MR imaging (left) is replaced by a marker for registration (right).
We used two types of fiducial markers in this study. The type-1 marker is used for both MRI and registration. The type-2 marker is a marker set consisting of two different markers: spherical markers used for MR imaging (Figure 2A) and flat-board-shaped markers, with a hole coinciding with the center of the spherical marker, used for registration (Figure 2B). Intra-operative navigational errors were assessed using the least-squares method. MR images taken pre- and postoperatively were used to evaluate the tumor volume and resection rate. The tumor area was computed for each slice, multiplied by the slice interval (1.5 mm), and summed over all slices. The formulas for the calculation of tumor volume (ml) and resection rate (%) were as follows:

TV = Σ (k = 1 ... n) TA_k × 1.5 mm

RR = (TV_pre − TV_post) / TV_pre × 100

where TV is the tumor volume, TA_k is the tumor area of slice k estimated by the software of the AIRIS II™, n is the number of slices used, RR is the resection rate, TV_pre is the TV before resection, and TV_post is the TV after resection.
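The tumor volume and resection rate calculation described above can be sketched in a few lines; the function names and the mm³-to-ml conversion are our illustrative additions, not part of the AIRIS software.

```python
# Minimal sketch of the TV and RR formulas above; names are ours.

def tumor_volume_ml(slice_areas_mm2, slice_interval_mm=1.5):
    """TV = sum_k TA_k * slice interval (areas in mm^2, result in ml)."""
    volume_mm3 = sum(slice_areas_mm2) * slice_interval_mm
    return volume_mm3 / 1000.0  # 1 ml = 1000 mm^3

def resection_rate_percent(tv_pre_ml, tv_post_ml):
    """RR = (TVpre - TVpost) / TVpre * 100."""
    return (tv_pre_ml - tv_post_ml) / tv_pre_ml * 100.0
```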
Figure 2. Fiducial marker type 2 for update navigation (UN). A: Markers for MR imaging. The markers contain a material that is positive in both T1 and T2 images. B: Markers for registration. The point indicated by the pointer (right) is the center of the marker for MR imaging (center of the green ball, left).
We evaluated the mean and maximum errors of the two types of markers, and the resection rates, in 53 cases that underwent tumor-resection surgery with iMRI from July 2000 to February 2002.
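The least-squares error assessment mentioned above boils down to a root-mean-square distance between corresponding marker positions. A minimal sketch (the function name and the point format are ours, not the PRS navigator's API):

```python
import math

# Hedged sketch: RMS distance between registered marker positions and
# their imaged counterparts, in the spirit of the least-squares error
# assessment described in the text. All names are illustrative.

def rms_error(points_a, points_b):
    """RMS Euclidean distance between paired 3D points."""
    sq = [sum((a - b) ** 2 for a, b in zip(pa, pb))
          for pa, pb in zip(points_a, points_b)]
    return math.sqrt(sum(sq) / len(sq))
```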
3. Results
The MR image quality obtained using the HHC was good enough in practice to distinguish the lesion from normal tissue in both T1- and T2-weighted images in 50 cases (94%).
Figure 3. Intraoperative MRI. A: Intraoperative MRI after tumor resection. The glioma adjacent to the eloquent (speech) area was totally resected (arrow: fiducial marker). B: Intraoperative MRI before tumor resection. The image was unclear and distorted because the receiving coil was too close to the MRI gantry.
However, the quality in three cases (6%) was unacceptable because of image distortion from an inadequate coil position in two cases and artifacts from alternating current in one case. The second iMR image was improved in all three cases. We performed the update navigation 94 times in 53 cases (mean 1.8 times per case). The type-1 marker was used 47 times and the type-2 marker 47 times. The mean error was more than 5 mm in three updates: one due to image distortion and two with the type-1 marker. The mean error was less than 3 mm in 91 updates. The analysis of the mean errors demonstrated a significant difference between the type-1 marker (1.42 ± 0.59 mm) and the type-2 marker (1.15 ± 0.37 mm). With respect to the mean maximum error, there was also a significant difference between the type-1 marker (2.68 ± 0.90 mm; maximum 4.65 mm) and the type-2 marker (2.24 ± 0.87 mm; maximum 2.36 mm). Compared with the resection rate of surgery using the type-1 marker (89%), the resection rate using the type-2 marker (93%) was better, but not significantly so (Table 1).

Table 1. Navigation errors and resection rates using two different fiducial markers.

                     Type-1 marker    Type-2 marker
cases (times)        47               47
mean error (mm)      1.42 ± 0.59      1.15 ± 0.37
resection rate (%)   89               93
4. Discussion
The accuracy of navigation depends on the intrinsic technical accuracy of the system, the quality of the images, and the registration procedures. This study demonstrated that two factors, image quality and fiducial markers, influence the accuracy of navigation updated by intraoperative MRI. Intraoperative MRI faces much more difficult imaging conditions than standard diagnostic MRI. To improve the image quality, it is necessary to prevent not only image artifacts from the alternating current of various operating equipment, but also image distortion. In our system, the cause of the distortion producing a prominent error was the head position in the gantry. The error estimate showing a prominent error prompted us to move the head position to the center of the gantry.
The error of the type-1 marker was significantly larger than that of the type-2 marker, indicating that we should exchange the markers at registration. The type-1 marker, which surgeons do not have to exchange, is more convenient than the type-2 marker; however, its registration error was significantly larger because the center of the marker could not be pointed to at registration. We are developing new markers with less error.

5. Conclusion

The accuracy of the UN depends not only on the internal error, but also on the image quality, including MR image distortion, and on the fiducial markers.

References
1. Kato A, Yoshimine T, Hayakawa T, et al: A frameless, armless navigational system for computer-assisted neurosurgery. Technical note. J Neurosurg 74:845-849, 1991
2. Watanabe E, Mayanagi Y, Kosugi Y, et al: Open surgery assisted by the neuronavigator, a stereotactic, articulated, sensitive arm. Neurosurgery 28:792-799, 1991
3. Maciunas RJ, Galloway RL, Latimer JM: The application accuracy of stereotactic frames. Neurosurgery 35:682-695, 1994
4. Zamorano LJ, Nolte L, Kadi AM, et al: Interactive intraoperative localization using an infrared-based system. Neurol Res 15:290-298, 1993
5. Nimsky C, Ganslandt O, Cerny S, et al: Quantification of, visualization of, and compensation for brain shift using intraoperative magnetic resonance imaging. Neurosurgery 47:1070-1080, 2000
6. Hata N, Dohi T, Iseki H, et al: Development of a frameless and armless stereotactic neuronavigation system with ultrasonographic registration. Neurosurgery 41:608-613, 1997
7. Black PM, Moriarty T, Alexander E, et al: Development and implementation of intraoperative magnetic resonance imaging and its neurosurgical applications. Neurosurgery 41:831-842, 1997
8. Wirtz CR, Bonsanto MM, Knauth M, et al: Intraoperative magnetic resonance imaging to update interactive navigation in neurosurgery: method and preliminary experience. Computer Aided Surgery 2:172-179, 1997
9. Iseki H, Muragaki Y, Taira T, et al: New possibilities for stereotaxis: information-guided stereotaxis. Stereotact Funct Neurosurg 76:159-167, 2001
10. Muragaki Y, Hashizume M: Recent advances in surgical technology - open MRI and "real-time" navigation. Fukuoka Igaku Zasshi 93(11):223-30, 2002
ACCURACY ANALYSIS OF VESSEL SEGMENTATION FOR A LITT DOSIMETRY PLANNING SYSTEM*
J. DREXL1, V. KNAPPE2, H. K. HAHN1, K. S. LEHMANN3, B. FRERICKS3, H. SHIN4, H.-O. PEITGEN1
1. MeVis - Center for Medical Diagnostic Systems and Visualization, Universitaetsallee 29, D-28359 Bremen, Germany
Email: [email protected]
2. Laser- und Medizin-Technologie GmbH, Berlin
3. Charite - Campus Benjamin Franklin, Berlin
4. Medical School Hannover, Hannover
Recently, we reported on a system integrating segmentation of intrahepatic structures such as vessels and tumors from helical CT scans with dose calculation for laser-induced thermotherapy (LITT), to address the problem of cooling effects induced by blood flow. The purpose of the present work is to evaluate our system's reliability in calculating the thermolesion due to a LITT intervention. In particular, we study to what extent segmentation errors affect the volume of a thermolesion. We do this by utilizing a physical phantom as well as a set of digital phantoms emulating idealized "vessels" placed in a background of textured "liver parenchyma".
1. Introduction
Simulating laser-induced thermal tissue reactions is a complex task which requires the combination of different physical processes while considering various parameters. A dosimetry model (LITCIT for Windowsa) was developed for the thermal laser treatment of biological tissue and applied to laser-induced thermotherapy (LITT) of organ tumors such as liver metastases. However, for a clinically relevant simulation the vascular situation must be taken into consideration. This point is very important because it is known for larger vessels that the blood flow acts as a cooling source, and consequently the risk of insufficiently treated regions cannot

* This work is supported by the DFG.
a Registered trademark of LMTB GmbH, Berlin.
be neglected. These regions are potential sources of tumor recurrence, and therefore their occurrence needs to be prevented in any case. To improve this situation, we integrated fast helical multi-slice CT imaging, image segmentation, and radiation dose calculation. The segmentation part is handled by our software tool Hepavision, which allows for the segmentation of intrahepatic structures, i.e., parenchyma, vessels and tumors. First qualitative results of our integrated planning system were presented by Lehmann et al.6 The purpose of the present work is to extend these results by quantitative data on the performance of our system, which we derived from measurements on a physical phantom and a digital phantom.

2. The physical phantom

We generated CT scans of a physical phantom designed at the Medical School Hannover. Our aim was to emulate the venous phase of the contrast agent delivery. We wanted to examine the effects of partial volume averaging, discretization artifacts and noise level on the segmentation process.
Figure 1. Photograph of the physical phantom.
The phantom is made from Plexiglas (polymethyl methacrylate; diameter 22 cm, thickness 3.2 cm) with cylindrical bore holes of defined diameters (1.0 mm, 2.0 mm, 3.0 mm, 5.0 mm, and 10 mm). Plexiglas is a very homogeneous material with HU values in the desired range for our study. As for the vessel diameters for our study, it has been reported5 that the diameter of the portal vein stem is 10.9 ± 2.1 mm. For the lower bound,
vessels with a diameter smaller than 3 mm are assumed to have a negligible individual cooling effect. We prepared a solution of the iodine contrast agent Ultravist (Schering AG, Berlin) in Ringer's solution at a defined concentration (1:50). The bore holes of the phantom were filled with the contrast agent to create the effect of contrast-agent-filled vessels embedded in contrast-agent-enriched liver tissue. The phantom prepared in this way was then scanned using a Siemens Somatom Sensation 16 CT scanner (Siemens AG, Erlangen), a state-of-the-art helical CT scanner. We used the standard protocol for liver CT scans with a standard abdomen field of view. Before scanning, the phantom was positioned orthogonal to the table-feed direction using the laser positioning device of the CT. The scan was reconstructed using a standard abdomen reconstruction kernel (B20f) at a resolution of 0.742 × 0.742 × 0.8 mm³. The statistics of the phantom are shown in Table 1. These values correspond well with the x-ray attenuation coefficients of liver parenchyma and portal vein in the venous phase.

Table 1. Statistics of the phantom scan.
(1:50 contrast agent dilution, B20f kernel)
Plexiglas:                          124.27 HU ± 13.33 HU
Contrast agent filled bore holes:   182.39 HU ± 13.04 HU
3. The digital phantom

It is possible to model the Somatom Sensation 16 scanning process from first physical principles. However, since the great level of detailed information necessary for such a task was not available for this study, we opted for a generic system-level approach. Our imaging situation is described by the following facts:
- We are not in a dedicated low-dose situation, so a Gaussian approximation for the Poisson-distributed quantum noise holds.
- The scene is of a low-contrast kind, so the difference in relative attenuation between background and foreground is small (see Table 1).
- As the difference in relative attenuation between background and foreground is small, background and foreground have the same noise level (see Table 1).
- We look only at small patches (95 mm² in the xy plane) centered at the middle of the phantom, so we do not have to take into account variations of the noise level due to its dependence on point location.
Furthermore, we assume monochromaticity of the x-ray radiation at an effective energy E_eff. In this situation, it is appropriate to invoke the so-called "first order linear equation for volume averaging" (see Beutel et al.2, p. 535), and to ignore all nonlinear effects.
3.1. Our model for the helical CT scanning process

The Hounsfield values of the CT scan are x-ray attenuation coefficients μ, scaled relative to the x-ray attenuation coefficient of water μ_H2O:

HU = 1000 (μ − μ_H2O) / μ_H2O   (1)

In our situation, due to the first-order approximation for volume averaging, we have a mixture between the attenuation values of foreground voxels ("vessel voxels", μ1) and background voxels ("parenchyma voxels", μ2):

μ = λ μ1 + (1 − λ) μ2,   λ ∈ [0; 1]   (2)

In a linearized setting, the superposition property also holds for the noise in the scan. This leads us to consider the model of a deterministic signal f superimposed on a stochastic noise process m:

f̃ = f + m   (3)
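Equations (1) and (2) can be illustrated directly; this is a toy sketch with illustrative names, not part of the authors' pipeline.

```python
# Toy illustration of eqs. (1) and (2): HU scaling and the first-order
# partial-volume mixture. All names here are illustrative.

def hounsfield(mu, mu_water):
    """HU = 1000 * (mu - mu_H2O) / mu_H2O (eq. 1)."""
    return 1000.0 * (mu - mu_water) / mu_water

def mixed_attenuation(mu_vessel, mu_parenchyma, lam):
    """mu = lam * mu1 + (1 - lam) * mu2, lam in [0; 1] (eq. 2)."""
    if not 0.0 <= lam <= 1.0:
        raise ValueError("lam must lie in [0; 1]")
    return lam * mu_vessel + (1.0 - lam) * mu_parenchyma
```

A voxel half covered by vessel and half by parenchyma thus receives the average of the two attenuation values before HU scaling.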
3.2. Modeling the deterministic signal f
We use the model outlined by Hahn.3 A binary-valued object function o(x, y, z) is convolved with the CT scanner's 3D point spread function h(x, y, z):

f(x, y, z) = h(x, y, z) * o(x, y, z)   (4)
Together with the discretization of f on the sampling grid of the scanner this models object blurring and the partial volume effect. Note that the convolution in the above equation is a continuous convolution. To generate signals on a digital computer, we use a supersampling technique for a digital approximation of the convolution integral (supersampling with 0.2 mm).
We approximate the point spread function h by the tensorization of three Gaussian functions:

h(x, y, z) = g_σx(x) g_σy(y) g_σz(z)

with

g_σ(t) = exp(−t² / (2σ²)) / (√(2π) σ)
This choice is motivated by the fact that in abdominal CT scanning fairly "soft" reconstruction kernels are routinely used, which are well described by bell-shaped functions. We estimated the σi from sharp edges in the phantom scan. In our study we used σi = 5 mm for all directions, and we used cylinders of length 3.2 cm as object functions o(x, y, z).
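A one-dimensional toy version of this blurring-and-sampling model: a binary object function is convolved with a normalized Gaussian kernel on the fine supersampling grid (0.2 mm) and then sampled on a coarser scanner grid. All parameter names and values in this sketch are illustrative, not the paper's implementation.

```python
import math

# 1-D toy version of eq. (4) plus the supersampling scheme described
# in the text. Names and parameter values are illustrative.

def gaussian_kernel(sigma_mm, step_mm, radius_sigmas=5.0):
    """Normalized, truncated Gaussian kernel sampled at step_mm."""
    n = int(radius_sigmas * sigma_mm / step_mm)
    k = [math.exp(-0.5 * (i * step_mm / sigma_mm) ** 2)
         for i in range(-n, n + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur_and_sample(obj, step_mm, sigma_mm, sample_every):
    """Discrete convolution of obj with the kernel, sampled coarsely."""
    ker = gaussian_kernel(sigma_mm, step_mm)
    n = len(ker) // 2
    out = []
    for i in range(0, len(obj), sample_every):
        acc = 0.0
        for j, w in enumerate(ker):
            idx = i + j - n
            if 0 <= idx < len(obj):
                acc += w * obj[idx]
        out.append(acc)
    return out

# A 10 mm "vessel" plateau on a 0.2 mm grid, blurred and resampled:
profile = blur_and_sample([0.0] * 50 + [1.0] * 50 + [0.0] * 50,
                          step_mm=0.2, sigma_mm=0.8, sample_every=4)
```

The resulting profile shows the smoothed edges responsible for the partial volume effect at the vessel boundary.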
3.3. Modeling the noise signal m
In conventional 2D CT, image noise is described as non-white Gaussian noise with zero mean. This noise is the cause of the textured look of CT images. This stochastic process4 is fully described by its 2D power spectrum N_m(ν_x, ν_y). The most simple non-trivial model is a radially symmetric model9 which contains no directional terms:

N_m(ν_x, ν_y) = N_m(ν_r),   ν_r = (ν_x² + ν_y²)^(1/2)

where ν denotes spatial frequency in cycles/cm. Using the FFT algorithm2, we estimated the 2D power spectrum from a 128 × 128 × 30 voxel patch of the phantom containing only Plexiglas. We then computed N_m(ν_r) by averaging over annuli with constant radius in the 2D power spectrum.9 From a visual inspection of the N_m(ν_r) plot, we chose a power spectrum derived from a generalized Hamming-weighted ramp filter with α := 0.52 (Δν denotes the bandwidth and b is a normalization constant).
Extension to 3D helical scanning: Helical scanning extends conventional 2D scanning by incorporating the idea of a continuous table feed. This leads to a blur in the z-direction, which is modeled by h_z(z). Incorporating the z-direction blur in the 2D noise model leads us to model N_m(ν_x, ν_y, ν_z) as

N_m(ν_x, ν_y, ν_z) = N_m(ν_x, ν_y) |H_z(ν_z)|²   (9)

with H_z as the Fourier transform of h_z. We synthesize textures by filtering Gaussian white noise with a texture synthesis filter g, whose Fourier transform is derived from N_m by

G(ν_x, ν_y, ν_z) = N_m(ν_x, ν_y, ν_z)^(1/2)   (10)

The filtering is carried out in Fourier space with the help of the FFT algorithm.
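Conceptually, the texture synthesis of eq. (10) filters Gaussian white noise so that its power spectrum approximately matches N_m. The 1-D sketch below uses a naive DFT for brevity (the paper uses the FFT); all names are ours and the example is not the authors' implementation.

```python
import cmath
import random

# Conceptual 1-D sketch of eq. (10): white noise shaped in Fourier
# space with G = N_m^(1/2). Naive DFT for brevity; names are ours.

def dft(x, inverse=False):
    """Naive discrete Fourier transform (O(n^2))."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
               for k in range(n)) for j in range(n)]
    return [v / n for v in out] if inverse else out

def synthesize_noise(target_spectrum, rng):
    """Shape Gaussian white noise to a non-negative power spectrum."""
    n = len(target_spectrum)
    white = [rng.gauss(0.0, 1.0) for _ in range(n)]
    spec = dft(white)
    shaped = [s * (p ** 0.5) for s, p in zip(spec, target_spectrum)]
    return [v.real for v in dft(shaped, inverse=True)]
```

With a flat target spectrum the filter is the identity and the white noise is returned unchanged, which makes the construction easy to sanity-check.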
3.4. Evaluation of the digital phantom's quality

Fig. 2 shows a 128×128 pixel patch of the original Somatom scan and a patch of the digitally synthesized phantom with an artificial "vessel" in it. One observes the good degree of realism exhibited by the digital phantom.ᵇ
Figure 2. 128×128 pixel patches: Somatom (left) and synthetic (right).
4. Results
With the above proposed method, we generated two sets of digital phantoms. Each image contained exactly one cylinder (diameters 3 mm, 5 mm and 10 mm, length 32 mm):

ᵇ The little bright spot in Fig. 2a is a very tiny calcification and not a contrast-agent filled bore hole. As there is no need to emulate such structures, no analogue is present in the synthetic image Fig. 2b.
Group A: 3 × 8 images, with different realizations of the noise process, but all cylinders with the same orientation.
Group B: 3 × 8 cylinders with random orientation, but all with the same realization of the background noise.

We used the digital phantoms to evaluate the ability of Hepavision to accurately segment vessels in a structured background, and to examine possible error-amplifying effects of further numerical calculations with LITCIT on these segmentations. As a control experiment for the realism of our digital phantom, we also segmented the 10 mm and 5 mm bore holes of the physical phantom. All Hepavision segmentations were done with the same default parameters for the Hepavision vascular preprocessing step and a fixed segmentation threshold θ = 20.

4.1. Evaluating the HepaVision vessel segmentation
4.1.1. Results on Group A and Group B
We used the known volume of a cylinder of known length and diameter as the gold standard for the calculations. For each diameter we calculated the mean segmented volume over the digital phantoms, give the errors relative to the gold standard, and calculate the standard deviation of the relative errors. The results for group A are shown in Table 2, for group B in Table 3. One can see that there is little variation between the two groups.

Table 2. Hepavision vessel segmentation on digital phantom: group A, 24 images

Diameter [mm]   true volume [mm³]   mean segmented volume [mm³]   mean relative error
10              2513.27             2133.4                        15.1 % ± 0.72 %
5               628.32              573.45                        8.7 % ± 2.59 %
3               226.19              181.62                        19.7 % ± 6.85 %
Table 3. Hepavision vessel segmentation on digital phantom: group B, 24 images

Diameter [mm]   true volume [mm³]   mean segmented volume [mm³]   mean relative error
10              2513.27             2181.6                        13.2 % ± 1.90 %
5               628.32              599.06                        4.7 % ± 2.45 %
3               226.19              172.14                        23.8 % ± 12.81 %
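The gold-standard volumes in Tables 2–4 follow directly from the cylinder geometry, V = π r² L with L = 32 mm; a quick check of the 10 mm row of Table 2:

```python
import math

def cylinder_volume_mm3(diameter_mm, length_mm=32.0):
    """Gold-standard volume of a cylindrical 'vessel' phantom."""
    r = diameter_mm / 2.0
    return math.pi * r * r * length_mm

def relative_error(segmented_mm3, true_mm3):
    """Relative segmentation error against the gold standard."""
    return abs(true_mm3 - segmented_mm3) / true_mm3

true10 = cylinder_volume_mm3(10)        # ~ 2513.27 mm^3, as in Table 2
err10 = relative_error(2133.4, true10)  # ~ 15.1 % for group A
```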
4.1.2. Results on the physical phantom, and comparison to the results on the digital phantoms

We segmented the four biggest bore holes of the physical phantom with the HepaVision vessel segmentation and computed the relative error to the gold standard volumes. From the results in Table 4 one sees that the relative errors of the physical phantom's bore holes are of the same magnitude as the relative errors of the digital phantom, so our digital phantom behaves very similarly to its physical counterpart.

Table 4. Hepavision vessel segmentation on physical phantom.

Bore hole      true volume [mm³]   segmented volume [mm³]   relative error
Nr. 1, 10 mm   2513.27             2122.3                   15.5 %
Nr. 2, 10 mm   2513.27             1957.9                   22.1 %
Nr. 3, 5 mm    628.32              615.62                   2.0 %
Nr. 4, 5 mm    628.32              592.71                   5.7 %
4.2. Evaluating the LITCIT dose calculation on HepaVision vessel segmentation results

We studied the ability of our planning system to predict the damaged volume of a LITT intervention when taking vessel perfusion into account. For this evaluation, we resampled the segmentation results to an isotropic voxel size of 0.8 mm. We took the exactly known geometry as the gold standard and compared it to the LITCIT calculations computed on the HepaVision segmentation results. We only give the damaged volume here. The flows were varied according to the formula

F = C · R^3.2

taken from Hahn³, where F denotes blood flow in ml/min, R denotes the radius of a vessel and C a proportionality constant. We chose C so that F = 10 ml/min for a diameter of 4 mm. R was considered as a priori known in this study. For the diameters 10 mm and 3 mm, we have only one image in the evaluation. The results are shown in Table 5.

Table 5. Calculation of damaged tissue on Hepavision segmentation of digital phantom.

Diameter [mm]   flow [ml/min]   nr. of images   true volume [cm³]   volume [cm³]   relative error
10              187.7           1               10.481              10.074         3.9 %
5               20.4            8               11.567              11.117         3.9 % ± 0.3 %
3               3.98            1               12.076              11.852         1.9 %
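The flow column of Table 5 is reproduced by a power law anchored at F = 10 ml/min for a 4 mm diameter; the exponent 3.2 used below is our inference from the tabulated values and should be read as an assumption:

```python
def flow_ml_per_min(diameter_mm, ref_diameter_mm=4.0, ref_flow=10.0, exponent=3.2):
    """Blood flow scaling F ~ diameter**exponent, normalized so that
    F = 10 ml/min at a 4 mm vessel diameter. The exponent 3.2 is
    inferred from the flow column of Table 5 (an assumption)."""
    return ref_flow * (diameter_mm / ref_diameter_mm) ** exponent

flows = [flow_ml_per_min(d) for d in (10, 5, 3)]
```

Evaluated at 10, 5 and 3 mm, this gives approximately 187.7, 20.4 and 3.98 ml/min, matching Table 5.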
5. Discussion
Our results lead us to the following conclusions:

- The validation using a digital phantom can substitute for the validation using the more expensive and more difficult to handle physical phantom.
- While the Hepavision vessel segmentation has been proven to accurately extract the vessel skeletons¹¹, it shows a systematic underestimation of the vessel volumes. This deviation can be attributed to the partial volume effect.
- Calculating the damaged volume from potentially erroneous segmentation results does not seem to amplify the segmentation errors.
For our future work we plan to extend the present results by creating digital phantoms not only for small patches and cylindrical objects, but also for more complicated vessel geometries. To assess our system's robustness towards variations in the quality of the input CT scans, we want to generate digital phantoms with different noise levels, different reconstruction kernels and varying degrees of anisotropy. A more reliable segmentation of the vessels, eliminating user interaction through the choice of a segmentation threshold and explicitly accounting for partial volume effects, should allow us to decrease the error in the calculation of the damaged volume to under 1 %. Finally, our goal is to estimate the blood flow F from the vessel radius R taken from the segmentation results, and to validate this strategy. We will also include results and data from in-vivo measurements on a pig animal model.
Acknowledgments

The authors thank Dr. Guido Prause for reviewing the manuscript.
References

1. D. Albrecht, C.T. Germer, C. Isbert, J.P. Ritz and H.J. Buhr, Endoskopie heute 10, 145 (1997).
2. J. Beutel, H.L. Kundel and R.L. Van Metter, "Handbook of Medical Imaging", SPIE Press, Bellingham (2000).
3. H.K. Hahn, "Macroscopic Morphometry and Dynamics of Hepatic Vessels in Humans", diploma thesis, 1999 / Report Nr. 12, MeVis Center for Medical Diagnostic Systems and Visualization (in German).
4. K.M. Hanson, Med. Phys. 6, 441 (1979).
5. B. Heidgen, "Segmentorientierte Portale Perfusion: Verzweigungstypen im Computertomogramm und Flussmessung mit Dopplersonographie", Philipps-Univ. Marburg, Inaug.-Diss. (1995).
6. K.S. Lehmann, J.P. Ritz, J. Drexl, B. Frericks, H. Bourquain, A. Schenk, C.T. Germer, H.O. Peitgen and H.J. Buhr, "Fusionierung von 3D-Bildgebung der Leber mit einem Planungs-System für die In-situ-Ablation maligner Lebertumoren", presented at CURAC 2003, Nürnberg, 4.-7.11.2003.
7. A. Littmann, A. Schenk, B. Preim, K.S. Lehmann, J. Ritz, C. Germer, A. Roggan and H.O. Peitgen, "Kombination von Bildanalyse und physikalischer Simulation für die Planung von Behandlungen maligner Lebertumore mittels laserinduzierter Thermotherapie", Bildverarbeitung für die Medizin 428, Springer, Berlin (2003).
8. A. Littmann, A. Schenk, B. Preim, G. Prause, K.S. Lehmann, A. Roggan and H.O. Peitgen, "Planning of Anatomical Resections and In-situ Ablations in Oncologic Liver Surgery", Computer Assisted Radiology and Surgery 684, Elsevier, Amsterdam (2003).
9. S.J. Riederer, N.J. Pelc and D.A. Chesler, Phys. Med. Biol. 23, 446 (1978).
10. A. Roggan, J.P. Ritz, V. Knappe, C.T. Germer, C. Isbert, D. Schadel and G. Muller, Med. Laser Appl. 16, 65 (2001).
11. D. Selle, B. Preim, A. Schenk and H.O. Peitgen, IEEE Trans Med Imaging 21, 1344 (2002).
Clinical Case Studies
RESECTION OF BONY TUMORS WITHIN THE PELVIS - HEMIPELVECTOMY USING NAVIGATION AND IMPLANTATION OF A NEW, CUSTOM-MADE PROSTHESIS

JENS GEERLING, DANIEL KENDOFF, LEONHARD BASTIAN, ECKEHARD MOSSINGER, MARTINUS RICHTER, TOBIAS HÜFNER, CHRISTIAN KRETTEK

Trauma Department, Hannover Medical School (MHH), Hannover, 30625, Germany

The treatment of malignant bony tumors within the pelvis includes large resections. This represents a complex problem in clinical practice due to the important organs and structures involved. Computer assisted surgery (CAS) based on CT data could increase the accuracy of tumor resections. In a patient, a chondrosarcoma was diagnosed, appearing as a 7 × 7 × 5 cm tumor within the left acetabulum. The femur had to be separated from the body because of the tumor. To fill the gap left by the resected bone, a custom-made prosthesis was built. Based on the CT dataset, a plastic model of the pelvis was produced and the resection was planned. The prosthesis was produced based on the fragment's dimensions. With a navigation system (Medivision, Switzerland) an exact intraoperative resection was performed. The implantation was performed 10 days after resection, demonstrating an exact fit of the prosthesis.
1. Introduction
Treatment for malignant bony tumors includes large resections [3, 4]. This represents a complex problem in clinical practice due to the important organs and structures involved. The biomechanical function of transferring load from the spine to the lower extremities should be restored. Computer assisted surgery (CAS) based on CT data could increase the accuracy of tumor resections. Currently, commercially available navigation systems use the precise preoperative CT data, allowing navigated pedicle screw application and cup placement for hip arthroplasty [1, 2]. Further applications within the pelvis have been reported, such as CAS-supported posttraumatic pelvic osteotomies and the placement of sacroiliac screws or long lag screws as required in the osteosynthesis of acetabular fractures.
2. Clinical case
In a 54-year-old active patient a chondrosarcoma was diagnosed. It appeared as a 7 × 7 × 5 cm tumor in the region of the left acetabulum with a huge soft tissue proportion (Fig. 1, 2). Due to the necessary resection, the only treatment option, the femur had to be separated from the body, because the femoral head had to be removed as well.
Figure 1: Preoperative CT showing the tumor within the left acetabulum. Furthermore the huge soft tissue portion is visible.
Figure 2: Within the coronal slice the extrusion of the pelvic organs is visible.
3. Solution
The question of filling the gap left by the resected bone was solved with a custom-made prosthesis. Based on the CT dataset, a plastic model of the patient's pelvis was produced. The resection lines were planned and the fragment with the tumor was resected (Fig. 3). A custom-made prosthesis was produced, including a cup for a hip prosthesis, based on the fragment's dimensions (Fig. 4). The prosthesis was built of cpTi with a surface coated with Bonit and antibiotics.
Figure 3: The plastic model of the patient's pelvis with the planned resection of the left acetabulum.
Figure 4: The custom-made prosthesis built to the fragment's dimensions showed a good fit into the model.
Due to this procedure, a prosthesis was built that fitted exactly into the resected plastic pelvis model. With a navigation system (Medivision, Switzerland) an exact intraoperative resection was performed. A CT of the pelvis model was acquired. Due to the manufacturing method, the model is 3 % bigger than the patient's pelvis; the dataset was scaled back to 100 % for the navigation. The resection was performed using navigated chisels in combination with the module "spine" (Fig. 5, 6).
Figure 5: The intraoperative situs with a difficult orientation. The navigated chisel can be seen.
Figure 6: On the navigation monitor the orientation of the chisel can be directed within the three-dimensional dataset. For navigation, a CT of the model was used for precise resection within the planned resection lines.
The implantation of the prosthesis was done 10 days after resection of the tumor, demonstrating an exact fit of the prosthesis into the resection lines (Fig. 7). Pathological investigation showed tumor-free margins.
4. Conclusion
The challenging operation could be done with a satisfying result for the patient, combining modern technologies like the computer-based production of a prosthesis for the pelvis with innovative resection using computer assisted surgery.
Figure 7: During the implantation of the prosthesis an exact fit into the pelvis was achieved due to the navigated resection.
References

1. Berlemann, U. et al., [Computer-assisted orthopedic surgery. From pedicle screw insertion to further applications]. Orthopade, 1997. 26(5): p. 463-9.
2. Jaramaz, B. et al., Computer assisted measurement of cup placement in total hip replacement. Clin Orthop, 1998(354): p. 70-81.
3. Phang, P.T. et al., Effects of positive resection margin and tumor distance from anus on rectal cancer treatment outcomes. Am J Surg, 2002. 183(5): p. 504-8.
4. York, J.E., et al., Sacral chordoma: 40-year experience at a major cancer center. Neurosurgery, 1999. 44(1): p. 74-9.
F-MRI INTEGRATED NEURONAVIGATION - LESION PROXIMITY TO ELOQUENT CORTEX AS PREDICTOR FOR MOTOR DEFICIT

RENE KRISHNAN, HILAL YAHYA, ANDREA SZELENYI, ELKE HATTINGEN¹, ANDREAS RAABE AND VOLKER SEIFERT

Department of Neurosurgery, Johann Wolfgang Goethe-University, Schleusenweg 2-16, 60528 Frankfurt/Main, Germany
Since the introduction of f-MRI in the early 90's, this technique has become important to the neurosurgeon at four key stages in the clinical management of his patients: 1) assessing the feasibility of radical surgical resection, 2) surgical planning, 3) selecting patients for invasive functional mapping procedures and 4) intraoperative visualization of functional areas. In this prospective study we examined the occurrence of a new postoperative motor deficit as a function of a lesion's distance to the functional areas as provided by functional magnetic resonance imaging. 54 patients with lesions in close proximity to the motor cortex were included in the protocol. Preoperative EPI T2* BOLD imaging was performed during standardized paradigms for hand, foot and tongue movement. Data analysis was done with the Brainvoyager software (Brain Innovation, Maastricht, NL). For functional neuronavigation we use the VectorVision² system (BrainLAB, Heimstetten, Germany). Outcome was analyzed as a function of a lesion's distance to functional motor areas, resection grade, lesion size, patient age and histology. According to the distance of a lesion to the motor areas (=LAD), four risk groups were graded. Gross total resection was achieved in 45 patients; 9 patients (low grade glioma = 3, glioblastoma = 6) had a subtotal resection. The neurological outcome improved in 16 patients (30%), was unchanged in 29 patients (53%) and deteriorated in 9 patients (17%). Significant predictors of a new neurological deficit were a lesion-to-activation distance < 5 mm (p<0,009) and subtotal resection (p<0,014). The determination of a lesion's proximity to the primary motor cortex, based on preoperative functional MRI, may be a key for predicting the risk of postoperative deterioration. Our data suggest that a lesion-to-activation distance < 5 mm is associated with a higher risk of neurological deterioration. Within a 10 mm range cortical stimulation should be performed.
For a lesion-to-activation distance > 10 mm a complete resection can be achieved safely.
1. Introduction
Preservation of function during surgery of lesions in the central region is a challenge for neurosurgeons, especially in neurologically intact patients.
¹ Institute of Neuroradiology, Johann Wolfgang Goethe-University, Schleusenweg 2-16, 60528 Frankfurt/Main, Germany
Surgical resection is the most effective treatment for intra-axial tumors, primary or metastatic, and patient survival and quality of life are correlated to the extent of resection [1,2,3]. The treatment offered to a patient is based on the best knowledge about the natural course, dignity, invasive character and location of a lesion as well as pre-existing and expected neurological deficits when performing surgery. Functional MRI, based on the BOLD contrast [4], is capable of evaluating cortical activity non-invasively and task-specifically [5,6]. The objective of our prospective study was to investigate whether f-MRI allows predicting the risk of a new postoperative motor deficit. The proximity of a lesion to eloquent motor cortex, as shown by f-MRI activation, was measured and compared to the patient's postoperative neurological status and the type of resection for different lesions in the perirolandic area. Moreover, we sought to use f-MRI to better define which patients benefit from intraoperative cortical stimulation and mapping.

2. Patients and Methods
Fifty-four consecutive patients (25 female and 29 male, mean age 52,3 years, range 22-75 years) with lesions close to the motor strip (Table 1) were selected as candidates for functional neuronavigation between June 2000 and September 2002. Functional MRI was performed before surgery and integrated in a neuronavigation 3D data set. Each patient gave written informed consent. Intracranial pathologies included glioblastoma in 16 cases, meningioma in 12 cases, metastases in 10 cases (one case with multiple metastases), glioma grade I-II in 7 cases, glioma grade III in 5 cases, cavernoma in 2 cases, ependymoma in 1 case and an AVM in 1 case. Lesions were located in the left hemisphere in 29 cases and in the right hemisphere in 26 cases.

2.1. Functional MRI

Functional magnetic resonance imaging was performed with a 1,5 Tesla Siemens Magnetom Vision scanner (Siemens AG, Erlangen, Germany) using the echo-planar T2* imaging technique (EPI; 15 slices, slice thickness 6 mm, slice distance 0,6 mm, TE 66 ms, TR 4 sec, matrix 128x128, FOV 230 mm, 120 measurements, 3 dummy measurements). The paradigm used consisted of alternating tongue movements along the teeth as well as movements of the hand (finger tapping) and foot (toe flexing) contra-lateral to the affected side. No resting condition was implemented. During each examination a T1-weighted 3D MP-RAGE data set was acquired as an anatomical reference scan. The voxel size of the images was isometric with 1×1×1 mm.
Data analysis was performed using the Brainvoyager 2000 software (Brain Innovation, Maastricht, NL). 2D- and 3D-functional maps were calculated based on a cross-correlation analysis between the measured and the presumed signal time course for each voxel after motion correction. The resting phase for one condition was defined by the other two conditions, e.g. the time periods of hand and foot movements were used as the resting period for the calculation of the tongue activation map.
2.2. Neuronavigation

We use the BrainLAB VectorVision² neuronavigation system (BrainLAB, Heimstetten, Germany) [7,8]. This image-guided, frameless and armless localization system is based on passive marker reflection of infrared flashes. It consists of a planning workstation (Pentium II, 400 MHz, 64 MB RAM, 6 GB HD) with software tools for image transfer, co-registration of multi-modal image sets and surgical planning. A second unit is located in the OR, which consists of a workstation, a Polaris CCD camera array, a touch screen, which can be draped sterile and placed in the operating field, an array of infrared light emitting sources and a pointer device and/or an adapter device for microscope tracking. A lesion was considered small if its volume was 0,1-10 ccm, medium size if the volume was 10,1-60 ccm, and large if the volume was >60 ccm.

2.3. Image fusion

For image fusion a fully automatic, intensity-based, commercially available algorithm was used, which is capable of registering and fusing medical image sets regardless of the imaging modality and the scan orientation of the image sets. With this technique multiple image sets were aligned, mainly 3D MP-RAGE neuronavigation T1 MRI and f-MRI image sets.

2.4. Analysis of lesion distance to functional areas
The distances of the f-MRI areas to the lesion were measured in three dimensions with the VVplanning 1.3 software (BrainLAB, Heimstetten, Germany). The closest distance between the border of a functional area and the border of a lesion was calculated in the composed image set after image fusion, containing both anatomical and functional information. The distance of each functional area was noted in a database.
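At its core, this closest border-to-border distance is a minimum over point pairs; a naive 3-D sketch with hypothetical coordinates (not the VVplanning implementation):

```python
import math

def closest_distance_mm(border_a, border_b):
    """Smallest Euclidean distance between two sets of 3-D border
    points (x, y, z) given in mm, by brute force over all point pairs."""
    return min(math.dist(p, q) for p in border_a for q in border_b)

# Hypothetical border samples of a functional area and a lesion
functional_area = [(10.0, 20.0, 30.0), (12.0, 20.0, 30.0)]
lesion = [(12.0, 24.0, 33.0), (20.0, 20.0, 30.0)]
lad = closest_distance_mm(functional_area, lesion)  # lesion-to-activation distance
```

Real border sets contain thousands of surface voxels, so a production implementation would use a distance transform or spatial index rather than this quadratic scan.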
2.5. Surgical procedures

All craniotomies were performed under general anesthesia without muscle relaxation, with the patient's head immobilized in a Mayfield headrest system (OMI, Inc., Cincinnati, OH, USA). A rigid mechanical connection was established between the head holder and the reference star. Extreme positioning (full lateral positioning, full flexion or extension of the head) was avoided to minimize brain shift and to avoid venous congestion. After segmentation of the lesion and the functional areas, the contours of these three-dimensional objects are available on-line to the surgeon during the operation, being displayed in the eyepieces of the microscope. A color code on a sterile-draped monitor in the operative field gives immediate feedback about the nature of the displayed contour.
2.6. Intraoperative electrical stimulation and correlation with f-MRI

The indication for and the procedure of electrocortical stimulation and mapping were standardized as follows: As neuronavigation was used in all cases, the craniotomy was tailored according to the exact extension of the lesion plus a margin of 10 mm for the opening of the dura mater. The exposed area was checked for typical landmarks of the motor cortex according to the neuronavigation data set (T1-weighted MR images). In those cases where the motor strip and the confining sulci could reliably be identified and lay outside the dural opening, no electrocortical stimulation was performed. In cases where the motor strip was at least partially exposed, cortical mapping was performed. Electrocortical stimulation was also performed in cases without reliable anatomical landmarks of the motor strip and in cases with deep-seated lesions, to use continuous cortical stimulation as monitoring for the pyramidal tract. For stimulation we used the EWACS IOM System 916, Software Version 2.1 (inomed, Teningen, Germany) and monopolar current with 50-500 Hz and 5-20 mA.
2.7. Data analysis and statistical analysis

Depending on the distance between the functional areas, as defined by f-MRI activation, and the lesion, a subdivision into four groups was made. The outcome of all patients in a particular group was compared to lesion size, age, histology, type of resection and distance to the closest f-MRI area, and statistically tested with the Pearson chi-square test using commercially available software (SPSS for Windows 11.5, Chicago, IL, U.S.A.). Significance was assumed for p < 0,05.
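For a 2×2 table (deterioration versus LAD < 5 mm), the Pearson chi-square statistic has a closed form. A sketch using the group counts reported in the Results (7 of 20 patients with a LAD < 5 mm deteriorated, 2 of the remaining 34 did); no continuity correction is applied, so the value need not match the SPSS output exactly:

```python
def pearson_chi2_2x2(a, b, c, d):
    """Pearson chi-square for the 2x2 contingency table
        [[a, b],
         [c, d]]
    without Yates continuity correction."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Deteriorated vs. not, for LAD < 5 mm (7/20) against LAD >= 5 mm (2/34)
chi2 = pearson_chi2_2x2(7, 13, 2, 32)
significant = chi2 > 3.841   # chi-square critical value for p = 0.05, 1 df
```

With these counts the statistic is about 7.7, well above the 5 % critical value, consistent with the reported p < 0,009.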
3. Results
3.1. Group IV (distance >15 mm between f-MRI area and lesion)

This group consisted of 7 patients, 3 female, 4 male, mean age 46,6 years, range 22-63 years, with glioblastoma (n=3), metastasis (n=2), AVM (n=1) and cavernoma (n=1). The lesions in this group were rather small (mean 6,2 ccm, range 0,4-14,8 ccm) and were situated in the left hemisphere in 4 cases and in the right hemisphere in 3 cases. Functional MRI experiments were performed for hand movement in all cases, tongue movement in 6 cases and foot movement in 3 cases. All three areas were included in the neuronavigation data set for 3 patients, two functional maps in 3 cases, and 1 area in 1 case. Gross total resection was achieved in six cases; one patient with a glioblastoma had a subtotal resection. In this group no neurological deteriorations were seen, 4 patients were unchanged (57%) and 3 patients improved after surgery (43%).

3.2. Group III (distance 10-15 mm between f-MRI area and lesion)

This group consisted of 15 patients, 7 female and 8 male, mean age 51,3 years, range 29-74 years, with glioma WHO II (n=3), glioma WHO III (n=3), glioblastoma (n=3), meningioma (n=2), metastasis (n=3), and cavernoma (n=1). The lesions in this group were small (0,1-10 ccm) in 4 cases, medium size (10-60 ccm) in 8 cases and large (>60 ccm) in 3 cases. Localization was left hemispheric in 7 cases and right hemispheric in 8 cases. Functional MRI experiments were performed for hand movement in all cases, tongue movement in 12 cases and foot movement in 9 cases. All three areas were included for neuronavigation in 8 patients, two functional maps in 5 cases, and 1 area in 2 cases. Gross total resection was achieved in 13 patients; two patients had a subtotal resection, one with a glioma WHO II and one with an oligodendroglioma. 9 patients had an unchanged neurological status (60%), and 4 patients improved after surgery (26,7%).
Neurological deterioration was seen in 2 patients (13,3%), one after reoperation for an epi- and subdural hematoma and one with persistent postictal hemiplegia of her right arm. However, these deficits were not related to the surgical manipulation, as they were not caused by direct damage to the motor cortex when removing the lesion.

3.3. Group II (distance 5-10 mm between f-MRI area and lesion)
This group consisted of 12 patients, 2 female and 10 male, mean age 54,3 years, range 27-75 years, with glioma WHO II (n=2), glioma WHO III (n=1), glioblastoma (n=3), meningioma (n=4), metastasis (n=1), and ependymoma (n=1). The lesions in this group were small (0,1-10 ccm) in 2 cases, medium size (10-60 ccm) in 8 cases and large (>60 ccm) in 2 cases. Localization was left in 7 cases and right in 5 cases. Functional MRI experiments were performed for hand movement in 11 cases, additional tongue movement in 7 cases and foot movement in 9 cases. All three areas were included in 5 patients, two functional maps in 5 cases, and 1 area in 2 cases. Gross total resection was achieved in 9 cases, as confirmed by postoperative MRI. Two patients, one with a glioblastoma and one with an oligodendroglioma, had a subtotal resection. No neurological deteriorations were seen, 6 patients were neurologically unchanged (50%) and 6 patients improved after surgery (50%).

3.4. Group I (distance 0-5 mm between f-MRI area and lesion)

This largest group consisted of 20 patients, 13 female and 7 male, mean age 53,9 years, range 26-71 years, with glioma WHO II (n=1), glioma WHO III (n=2), glioblastoma (n=7), meningioma (n=6), and metastasis (n=4). The lesions in this group were small (0,1-10 ccm) in 8 cases, medium size (10-60 ccm) in 9 cases and large (>60 ccm) in 3 cases. Localization was left and right hemispheric in 10 cases each. Functional MRI experiments were performed for hand movement in all cases, additional tongue movement in 17 cases and foot movement in 14 cases. All three areas were included in 12 patients, two functional maps in 8 cases. Gross total resection was achieved in 17 cases. Two patients with a glioblastoma and one patient with a low-grade glioma had a subtotal resection. 3 patients improved after surgery (15%) and 10 patients had an unchanged neurological status (50%). New postoperative motor deficits were seen in 7 patients (35%). In five cases the neurological deteriorations were attributable to the surgical manipulation of the motor cortex during tumor removal. In our study we have used functional MRI for grading the risk of a new postoperative motor deficit.
The overall morbidity rate in our patients was 16,7% (9 patients), 53,7% had an unchanged neurological status (29 patients) and 29,6% (16 patients) improved after surgery. A gross total tumor resection was achieved in 83,3% (45 patients); the remaining 16,7% (9 patients) had a subtotal tumor resection (glioblastoma, n=5; oligodendroglioma, n=2; glioma WHO II, n=2). No partial resections or biopsies were performed. Analyzing the outcome with regard to the distance of a lesion to the f-MRI activation (lesion to activation distance = LAD) leads to more specific results. We did not observe postoperative motor deficits related to damage of the primary motor cortex during surgery for lesions with a LAD > 5 mm, but for a LAD < 5 mm this rate was 35%. The group with a LAD of 5-10
mm did distinctly benefit from surgery, with a neurological improvement in 50% of the cases. Neurological deterioration was significantly correlated to a LAD < 5 mm (p < 0,009) and subtotal resection (p < 0,014). The relative risk (odds ratio) for a neurological deterioration in risk group I (LAD < 5 mm) was 7,8 (95 percent confidence interval 1,4 to 42,1). There was no correlation to lesion size, age, histology or gender.

4. Conclusion
Despite the methodological limitations [10,11], we believe this technique is of primary importance to the neurosurgeon at four key stages in the clinical management of his patients: 1) assessing the feasibility of radical surgical resection, 2) surgical planning, 3) selecting patients for invasive functional mapping procedures and 4) intraoperative visualization of functional areas. The lesion's distance to the motor areas can be judged and an individual, data-based risk profile can be established before the actual operation. This improves the quality of patient counseling and helps the surgeon to decide which tumors require intraoperative mapping. In this prospective study we examined the occurrence of a new postoperative motor deficit as a function of a lesion's distance to the functional areas as provided by functional magnetic resonance imaging. We conclude that lesions < 5 mm distant from the functional areas carry an intrinsically high risk of postoperative motor deficits. Lesions > 10 mm from the functional areas can be resected safely. The clinical value of this study consists in defining a clinical application for f-MRI in patients with lesions adjacent to the motor cortex. Provided that the methodology is reliable, f-MRI can be used as a clinical tool to decide whether a given lesion requires intraoperative cortical mapping. The determination of a lesion's proximity to the primary motor cortex, based on preoperative functional MRI, may be a key for predicting the risk of postoperative deterioration.
References

1. M. Lacroix, D. Abi-Said, D.R. Fourney et al, A multivariate analysis of 416 patients with glioblastoma multiforme: prognosis, extent of resection, and survival, J Neurosurg (2001) 190-198
2. A. Peraud, H. Ansari, K. Bise, H.J. Reulen, Clinical outcome of supratentorial astrocytoma WHO grade II, Acta Neurochir. (Wien.) (1998) 1213-1222
3. S.L. Stafford, A. Perry, V.J. Suman et al, Primarily resected meningiomas: outcome and prognostic factors in 581 Mayo Clinic patients, 1978 through 1988, Mayo Clin. Proc. (1998) 936-942
4. S. Ogawa, T.M. Lee, A.R. Kay, D.W. Tank, Brain magnetic resonance imaging with contrast dependent on blood oxygenation, Proc. Natl. Acad. Sci. U.S.A. (1990) 9868-9872
5. S.W. Atlas, R.S. Howard, J. Maldjian et al, Functional magnetic resonance imaging of regional brain activity in patients with intracerebral gliomas: findings and implications for clinical management, Neurosurgery (1996) 329-338
6. J.A. Maldjian, M. Schulder, W.C. Liu et al, Intraoperative functional MRI using a real-time neurosurgical navigation system, J. Comput. Assist. Tomogr. (1997) 910-912
7. H.K. Gumprecht, D.C. Widenka, C.B. Lumenta, BrainLab VectorVision Neuronavigation System: technology and clinical experiences in 131 cases, Neurosurgery (1999) 97-104
8. A. Muacevic, H.J. Steiger, Computer-assisted resection of cerebral arteriovenous malformations, Neurosurgery (1999) 1164-1170
9. U. Sure, L. Benes, T. Riegel, D.M. Schulte, H. Bertalanffy, Image fusion for skull base neuronavigation. Technical note, Neurol. Med. Chir (Tokyo) (2002) 458-461
10. D.L. Hill, A.D. Smith, A. Simmons et al, Sources of error in comparing functional magnetic resonance imaging and invasive electrophysiological recordings, J. Neurosurg. (2000) 214-223
11. F.E. Roux, D. Ibarrola, M. Tremoulet et al, Methodological and technical issues for integrating functional magnetic resonance imaging data in a neuronavigational system, Neurosurgery (2001) 1145-1156
TRENDS AND PERSPECTIVES IN COMPUTER-ASSISTED DENTAL IMPLANTOLOGY
KURT SCHICHO, ROLF EWERS, ARNE WAGNER, RUDOLF SEEMANN, GERT WITTWER*
University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, 1090 Vienna, Austria
This article gives a survey of current trends in computer-assisted navigation in dental implantology. From 1995 until the end of 2003, 72 patients at our clinic were supplied with 395 dental implants with the support of computer-assisted navigation technology, including 17 patients with transmucosal interforaminal dental implants (68 implants). State-of-the-art navigation software for dental implantology is suited for routine clinical application; commercially available systems provide sufficient precision and reliability even in difficult implantological cases. Recent developments focus on supporting implantologists during the planning phase by means of interactive teleconsultation via the Internet.
1. Introduction
Since the mid-1990s, the application of computer-assisted navigation technology has been gaining in significance in dental implantology, allowing accurate intraoperative realization of preoperative planning according to prosthetic, functional and aesthetic criteria. Basic research has been followed by the development of commercially distributed systems. They are based on similar technical principles as navigation systems used in other applications, but provide software that is optimized for implantology and therefore easy to use [1-4]. Computer-assisted navigation contributes to improved intraoperative safety and helps the surgeon avoid damage to anatomical structures (e.g. nerves). Preoperative evaluation of the available bone using computer-generated 2D and 3D renderings from the CT scan is the basis for the definition of implant positions and therefore for an optimization of the mechanical load situation. Several studies have demonstrated sufficient accuracy of navigated implant placement, including in difficult implantological cases, e.g. in the proximity of the mental foramen [5-8].
2. Material and Methods
Since 1995, 72 computer-assisted dental implantation procedures have been completed successfully at our clinic. A total of 395 implants has been positioned by means of
* This work was funded from an industrial cooperation managed by Gert Wittwer MD DMD
this technology, including 68 transmucosal interforaminal implants. Due to improved software (especially developed for dental implantology, e.g. the "Virtual Implant™" by Artma, Vienna), the time necessary to prepare a surgical intervention with navigation has decreased significantly since the initial phase. Positioning of fiducial markers on the patient, CT scanning, preoperative planning, sterilization of instruments (trackers, ...), and setup of the navigation system required about 2-3 days in the initial phase of routine clinical application and is now reduced to approximately half a day [1]. At our clinic the Medtronic Treon™ StealthStation™, a system well known e.g. in neurosurgery, was applied for transmucosal interforaminal implantation for the first time in 9 cases. The transmucosal implantation method has the potential to reduce postoperative pain and soft tissue swelling and can improve patient comfort.
Figure 1. Preoperative planning of dental implants with the Medtronic Treon™ StealthStation™.
3. Preliminary clinical results of transmucosal interforaminal implantation with computer-assisted navigation
During the first 6-month follow-up, the loss of one implant each was observed in two patients because of incompatibilities of the applied implant procedure. Preoperative planning and intraoperative application of the navigation system could be accomplished without shortcomings. Technical setup and preparation (including a routine check of navigation tools and system components before sterilization) could be completed within an average duration of half a day. This initial application of the Medtronic Treon™ StealthStation™ in dental implantology showed stable and reliable performance in all 9 cases, but the menu structure and handling of the software would have to be adapted to the specific tasks of dental implantology to improve its suitability for routine clinical use.
Figure 2. Intraoperative navigation: The "cyclone" in the right lower window (concentric circles around the green dot) indicates the intraoperative angular orientation of the drill; the blue beam on the right edge of this window indicates the current length of the drilling channel.
4. Teleconsultation
Commercially available systems for implant navigation are user-friendly. However, the planning of implants requires specific prosthetic knowledge, which can be provided e.g. by qualified experts in implant competence centers. The availability of prosthetic knowledge can be seen as a limiting factor for the acceptance of navigation systems. By means of teleconsultation this specific knowledge can be transferred from remote experts to the surgeon who performs the treatment. In current studies we try to apply and evaluate this technology in detail. New telecommunication technologies like UMTS (Universal Mobile Telecommunications System) enable teleplanning with a high degree of interactivity. Preoperative transfer of CT data from the surgeon to the competence center, planning, transfer of the plan from the center to the operation site, feedback and, optionally, online modification of the plan (additional or fewer implants, ...) are the main steps in a teleplanning procedure that has the potential to improve the quality of the treatment.
5. Conclusion
Our preliminary results with transmucosal interforaminal implantation correspond with the expectations from the clinical point of view. The technical realization of the transmucosal concept is already applicable in the daily routine. Further developments concerning interactive teleconsultation for implant planning via the Internet have the potential to make highly specialized prosthetic knowledge available independent of the geographic location of the surgical intervention.
Acknowledgments
We kindly acknowledge advice and technical support from
- Dr. Michael Truppe, Medlibre Forschungsgesellschaft mbH, Germany (www.medlibre.org),
- Dr. Dietmar Legenstein, Medtronic Austria, and
- Telekom Austria and Mobilkom Austria (cooperation in the field of telemedicine via UMTS).
References
1. Ewers R, Schicho K, Seemann R, Reichwein A, Figl M, Wagner A. Computer aided navigation in dental implantology: 7 years of clinical experience. Journal of Oral and Maxillofacial Surgery; in press.
2. Schicho K, Ewers R. Teleconsultation and 3D visualisation in computer-assisted implantology: the current state of development (German). Impl J 5: 86-88, 2001.
3. Watzinger F, Birkfellner W, Wanschitz F, Millesi W, Schopper C, Sinko K, Huber IS, Bergmann H, Ewers R. Positioning of dental implants using computer-aided navigation and an optical tracking system: case report and presentation of a new method. J Craniomaxillofac Surg 27: 77-81, 1999.
4. Ploder O, Wagner A, Enislidis G, Ewers R. Computer-assisted intraoperative visualization of dental implants. Augmented reality in medicine. Radiologe 35: 569-572, 1995.
5. Siessegger M, Schneider BT, Mischkowski RA, Lazar F, Krug B, Klesper B, Zoller JE. Use of an image-guided navigation system in dental implant surgery in anatomically complex operation sites. J Craniomaxillofac Surg 29(5): 276-281, 2001.
6. Wanschitz F, Birkfellner W, Watzinger F, Schopper C, Patruta S, Kainberger F, Figl M, Kettenbach J, Bergmann H, Ewers R. Evaluation of accuracy of computer-aided intraoperative positioning of endosseous oral implants in the edentulous mandible. Clin Oral Implants Res 13: 59-64, 2002.
7. Wagner A, Wanschitz F, Birkfellner W, Zauza K, Watzinger F, Schicho K, Kainberger F, Czerny C, Bergmann H, Ewers R. Computer-aided placement of endosseous oral implants in patients after ablative tumor surgery: assessment of accuracy. Clin Oral Impl Res 14: 340-348, 2003.
8. Cavalcanti MG, Ruprecht A, Vannier MW. 3D volume rendering using multislice CT for dental implants. Dentomaxillofac Radiol 31: 218-223, 2002.
THE EXPERIENCE OF THE WORKING GROUP FOR COMPUTER ASSISTED SURGERY AT THE UNIVERSITY OF COLOGNE
R.A. MISCHKOWSKI, M. ZINSER, M. SIESSEGGER, A.C. KUBLER, J.E. ZOLLER
Department of Craniomaxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, 50937 Köln, Germany
[email protected]
This paper reports the experience of the interdisciplinary working group for computer assisted surgery established in 1999 at the University of Cologne. The main area of activity was intraoperative navigation using the infrared-based "Vector Vision2" and "RoboDent" systems. 582 image guided procedures were performed between 8/1999 and 11/2003. The most frequent indication in the head and neck area was the removal of tumors, followed by biopsies, removal of foreign bodies and insertion of dental implants. The accuracy of the image guided navigation systems used has been evaluated in clinical and experimental studies. Innovative techniques supporting the navigation were implemented and assessed for clinical use.
1. Introduction
With the introduction of reliable image guided navigation systems, an interdisciplinary working group for computer assisted surgery (Arbeitskreis Computer Assistierte Chirurgie = ACAS) was established in 1999 at the University of Cologne. The participating specialities are Craniomaxillofacial Surgery, Neurosurgery, ENT, Ophthalmology and Trauma Surgery. In monthly conferences complex multidisciplinary cases are presented and their management discussed. In addition to clinical issues, scientific investigations are planned and technical innovations demonstrated and reviewed.
2. Methods
For intraoperative navigation the infrared-based system "Vector Vision2" (BrainLAB, Heimstetten, Germany) was used. Image guided insertion of enossal implants was performed with the infrared-based "RoboDent" system (RoboDent, Berlin). The imaging data for the navigation procedures consisted of CT and MRI scans which were obtained using Somatom® CT and Magnetom® MRI systems (Siemens, München). The Magnetom® MRI system
provided - besides excellent image quality - additional features such as MR angiography, functional MRI and MR spectroscopy which were helpful for preoperative planning in selected neurosurgical cases. The imaging data were transferred in DICOM format to the planning workstation via an Intranet connection. Preoperative planning and intraoperative navigation were performed using the "Vector Vision cranial" software suite. For referencing purposes several methods such as skin-fixed fiducials, headset and laser surface scanning (Z-Touch) were applied. The referencing was carried out in a standard manner with the patient's head fixed in the Mayfield clamp by means of the skull reference array. In minor procedures, especially when performed under local anaesthesia, a flexible Velcro reference headband was used. The "RoboDent" system for placement of enossal implants utilizes, unlike the "Vector Vision2", dental-fixed splints as referencing device and thus allows for navigation within the moveable mandibular area.
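Laser surface scanning of the kind mentioned above typically aligns the scanned skin point cloud to a surface extracted from the CT data with an iterative-closest-point style algorithm. The sketch below is a generic illustration of that idea (our own, not the vendor's implementation), using a brute-force nearest-neighbour search and a least-squares rigid fit (Arun/Kabsch):

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping the points in P
    onto the correspondingly ordered points in Q (Arun/Kabsch method)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

def icp(scan, model, iters=30):
    """Align a laser-scan point cloud to a model cloud (e.g. the CT-derived
    skin surface).  Returns (R, t) such that scan @ R.T + t fits the model."""
    R_tot, t_tot = np.eye(3), np.zeros(3)
    cur = scan.copy()
    for _ in range(iters):
        # brute-force nearest neighbour of each scan point in the model
        d = np.linalg.norm(cur[:, None, :] - model[None, :, :], axis=2)
        matched = model[d.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t    # accumulate the transform
    return R_tot, t_tot
```

Real systems refine this with outlier rejection and fast spatial indexing; the sketch only shows the core alternation between correspondence search and rigid fitting.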
3. Results
Between 8/1999 and 11/2003 a total of 582 image guided navigation procedures were carried out in 470 patients. 65 operations (11.2%) were performed on an interdisciplinary basis. Within the head and neck area the most frequent indication was removal of tumors in 199 cases, followed by biopsies in areas difficult to access in 83 cases, removal of foreign bodies in 23 cases and insertion of dental implants in 18 cases. Non-oncological neurosurgical procedures were performed in 24 cases. In addition to the application in the head and neck area, the image guided navigation system was successfully used in trauma surgery for the insertion of pedicle screws in vertebral fractures. Since 12/2000, 123 patients were operated on using intraoperative guidance based on CT scans. A prospective study of 50 patients showed superior results regarding precision, surgery time and functional outcome compared with a matched historical collective treated in the traditional way [1,2].
Table 1. The application of image guided navigation for different indications between 8/1999 and 11/2003.

Indication        Patients   Procedures
Oncology          199        253
Biopsies          83         96
Foreign body      23         25
Implants          18         61
Neurosurgery      24         24
Trauma surgery    123        123
Total             470        582
A study evaluating the precision of the employed referencing methods showed the best results for headset referencing, followed by skin fiducials and laser scanning. Innovative techniques such as image guided microscope support, new hardware and software tools for preoperative planning [4] and intraoperative ultrasound visualisation were successfully implemented and tested in the clinical environment.

4. Conclusion
The experience of the working group for computer assisted surgery (ACAS) at the University of Cologne confirms the benefits of image guided navigation for surgical procedures in the head and neck area. The current evolution level of the systems allows for reliable and precise intraoperative guidance, mainly in oncologic procedures, implant placement [5] and removal of foreign bodies [3,6,7]. Beyond the head and neck area, image guided navigation has proved valuable for the treatment of vertebral fractures in the field of trauma surgery. Recent development aims at the utilization of image guided navigation in new craniofacial indication areas such as the minimally invasive treatment of mandibular fractures and orthognathic surgery.
References
1. Hahn U, Prokop A, Jubel A, Siessegger M, Rehm KE. Präzision der computer-navigierten Pedikelbohrung in der frakturierten Wirbelsäule. In: Kirschner P, Stürmer KM, eds. 65. Jahrestagung der Deutschen Gesellschaft für Unfallchirurgie e.V., 14.-17.11.2001. Hefte zur Zeitschrift "Der Unfallchirurg" 283 ed. Berlin: Springer, 111 (2001).
2. Hahn U, Andermahr A, Isenberg J, Jubel A, Prokop A, Rehm KE. Computer-navigierte Pedikelbohrung mit und ohne Laminektomie als Notfalleingriff bei Wirbelfrakturen mit progredienter Neurologie - Ein Zeitproblem? In: Rehm KE, Stürmer KM, Prokop A, eds. 66. Jahrestagung der Deutschen Gesellschaft für Unfallchirurgie e.V., 13.-15.11.2002. Hefte zur Zeitschrift "Der Unfallchirurg" 284 ed. Berlin: Springer, 106-107 (2002).
3. Mischkowski RA, Siessegger M, Zoller JE. Computer assisted removal of foreign bodies in the head and neck area. In: Lemke HU, Vannier MW, Inamura K, Farman AG, Doi K, editors. CARS 2000. Computer Assisted Radiology and Surgery. Amsterdam: Elsevier, 66-69 (2000).
4. Mischkowski RA, Bongartz J, Giel D, Frey S, Thelen A, Hering P. Holographische Gesichtsprofilmodelle zur Operationsplanung in der Mund-, Kiefer- und Gesichtschirurgie. Quintessenz 54, 1027-1030 (2003).
5. Siessegger M, Schneider BT, Mischkowski RA, Lazar F, Krug B, Klesper B, et al. Use of an image-guided navigation system in dental implant surgery in anatomically complex operation sites. J Craniomaxillofac Surg 29, 276-281 (2001).
6. Siessegger M, Mischkowski RA, Schneider BT, Krug B, Klesper B, Zoller JE. Image guided surgical navigation for removal of foreign bodies in the head and neck. J Craniomaxillofac Surg 29, 321-325 (2001).
7. Siessegger M, Mischkowski RA, Neugebauer J, Krug B, Zoller JE. Die chirurgische Entfernung von iatrogen luxierten Zähnen. Computerunterstützte Verfahren im klinischen Einsatz. Quintessenz 54, 479-484 (2003).
FLUOROSCOPIC NAVIGATION OF THE DYNAMIC HIP SCREW (DHS): AN EXPERIMENTAL STUDY
DANIEL KENDOFF, JENS GEERLING, MUSA CITAK, THOMAS GOSLING, TOBIAS HÜFNER, CHRISTIAN KRETTEK
Trauma Dept., Hannover Medical School (MHH), Hannover, 30625, Germany
MAURICIO KFURI JR.
Alexander v. Humboldt Stiftung; Ribeirao Preto Medical School, Sao Paulo, Brazil
Screw position in the femoral head has been considered the most important predictive factor for mechanical failure in pertrochanteric osteosynthesis and is usually assisted by fluoroscopic control. Computer-assisted surgery (CAS) can be an alternative aiming at precise screw insertion with a low radiation dose. In this study, artificial proximal femora underwent insertion of a conventional dynamic hip screw (DHS) and an anti-rotational screw (ARS). Three set-ups were tested: 1. conventional implantation with two C-arms using a guide wire; 2. drilling and implantation controlled by CAS solely, with a 3.2 mm drill bit, followed by insertion of a guide wire and drilling for the dynamic hip screw; 3. navigated drilling (3.2 mm) followed by a fluoroscopic control. Variables used to compare methods and surgeons were: operation time, radiation time, Tip-Apex Distance (TAD), and the insertion neck-shaft angles of DHS and ARS. Considering TAD as a precision parameter, CAS led to screw insertions as accurate as those with fluoroscopic control, with a total reduction of radiation time of up to 93%. Further clinical studies need to show improvements and usefulness in clinical use.
1. Introduction
Trochanteric fractures are common injuries related to high morbidity and mortality [1,2]. The dynamic hip screw has been the favored treatment in such cases [3]. Although excellent results have been described with this technique, the mechanical failure rate of sliding screws has been reported to vary from 8% to 23% [4]. The most predictive factor for failure is the screw position in the femoral head [5]. Optimal placement of the screw by the traditional technique can require extended radiation time. Computer assisted surgery could be an alternative in order to achieve the ideal screw position under low doses of radiation. An experimental comparative study was developed to address this question.
2. Materials and Methods
The dynamic hip screw (Synthes, Germany) was inserted in 80 proximal foam femora, divided into four equal groups.
Figure 1: Intraoperative set-up of navigated DHS insertion.
Figure 2: Intraoperative planning of trajectories on the navigation system.
In Group 1 the screw was inserted under fluoroscopic control. Navigation (Surgigate C-arm module, Medivision, Switzerland) was
used to guide screw placement in the other three groups, after initial planning of the trajectories on the system. Based on the argument that the drill diameter could interfere with navigation accuracy, we modified the usual technique by replacing the Kirschner guide wire with a 3.2 mm drill (Groups 2 and 3) or a 4.5 mm drill (Group 4). Differentiating Groups 2 and 3, a fluoroscopic control was added in Group 2 to verify the position of the guide drill before opening the definitive screw hole. Additionally, an anti-rotational 6.5 mm screw (Synthes) was inserted under fluoroscopic control (Group 1) or navigated control (Groups 2, 3 and 4). Operative time and radiation time were recorded. Anterior-posterior and lateral radiographs were taken postoperatively, enabling analysis of screw position by the TAD index [5]. The angle between each screw and the diaphysis was assessed in order to determine the parallelism between the sliding screw and the anti-rotational screw. A t-test was used to interpret the results.
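The TAD index of Baumgaertner et al. [5] sums the tip-to-apex distances measured on the anterior-posterior and lateral radiographs, each corrected for radiographic magnification using the known true diameter of the screw. A minimal sketch (function name and parameter names are our own):

```python
def tip_apex_distance(x_ap_mm, x_lat_mm, true_diam_mm, ap_diam_mm, lat_diam_mm):
    """Tip-Apex Distance (TAD) after Baumgaertner et al. [5].

    x_ap_mm, x_lat_mm: distance from screw tip to femoral-head apex as
    measured on the AP and lateral films.  Each measurement is scaled by
    (true screw diameter / screw diameter measured on that film) to
    correct for radiographic magnification, and the two corrected
    distances are summed."""
    return (x_ap_mm * true_diam_mm / ap_diam_mm
            + x_lat_mm * true_diam_mm / lat_diam_mm)
```

In Baumgaertner's original series, a TAD above roughly 25 mm was associated with an increased risk of screw cut-out, which is why the index serves as the precision parameter here.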
Figure 3: Navigated insertion of a 3.2 mm drill as the first step of the operative procedure.
3. Results
The mean operative time was 14.8 min in the fluoroscopic group and 16.1 min in the navigated groups. (p
groups. (p
4. Discussion and Conclusion
The navigated DHS technique depends on planning and a functional set-up of the operating theater, which normally adds time to the operation. Navigation offers precise insertion of the DHS with a dramatic reduction of radiation exposure of 94%. No significant difference in the use of navigation between beginner and expert surgeons was found. Regarding drilling precision, CAS offers a reproducible technique independent of surgical experience. Even if a single fluoroscopic control of drill positioning before definitive DHS insertion shows no difference in precision, it remains recommended, because the following steps are not reversible. Further implant-related implementations need to be integrated into navigation to optimize specific surgical procedures.
References
1. Bannister GC, Gibson AGF et al. The fixation and prognosis of trochanteric fractures; a randomized prospective controlled trial. Clin Orthop 254: 242-246, 1990.
2. Kyle RF, Gustilo RB, Premer RF. Analysis of six hundred and twenty-two intertrochanteric hip fractures. J Bone Joint Surg Am 61(2): 216-221.
3. Cole JD, Ansel LJ. Intramedullary nail and lag-screw fixation of proximal femur fractures - operative technique and preliminary results. Orthop Rev 23: 35-44, 1994.
4. Simpson AH, Varty K, Dodd CA. Sliding hip screws: modes of failure. Injury 20: 227-231, 1989.
5. Baumgaertner MR, Curtin SL, Lindskog BA, Keggi JM. The value of the Tip-Apex Distance in predicting failure of fixation of peritrochanteric fractures of the hip. J Bone Joint Surg Am 77(7): 1058-1064, 1995.
IMAGE-GUIDED NAVIGATION FOR INTERSTITIAL LASER TREATMENT OF VASCULAR MALFORMATIONS IN THE HEAD AND NECK
J. HOFFMANN, C. WESTENDORFF, D. TROITZSCH, U. ERNEMANN*, S. REINERT
Department of Oral and Maxillofacial Surgery, Tübingen University Hospital, Osianderstrasse 2-8, 72076 Tübingen, Germany; * Department of Neuroradiology, Tübingen University Hospital, Hoppe-Seyler-Strasse, 72076 Tübingen, Germany
Laser-induced interstitial thermal therapy (LITT) is a minimally invasive surgical technique for the treatment of vascular malformations. As the technique is interstitial, the placement of the laser fiber occurs remotely from the surgeon and is not visible, as it would be in an open surgical procedure or with percutaneous laser application. Multimodal image-data navigation-guided LITT was performed in 5 patients with giant venous malformations of the maxillofacial area. The system consisted of a specially developed new Nd:YAG laser fiber introducer set in conjunction with fused computed tomography and magnetic resonance based surgical navigation. By 3D reconstruction for laser surgery planning and determination of target areas for laser probe navigation, interstitial laser treatment could be performed precisely. In all cases, control examination clearly showed a diminished tumor volume and all patients reported subjective amelioration. The results suggest that navigation-guided LITT can be performed safely while preserving vital structures from damage. Image guided navigation-controlled LITT offers a non-invasive, safe treatment option in complex vascular malformations.
1. Background
Several computer assisted navigation systems are in use in modern surgery [1-5]. By enabling the surgeon to locate the continually updated position of surgical instruments on a monitor displaying a patient's three-dimensionally (3D) reconstructed computed tomography (CT) and magnetic resonance imaging (MRI) scan data, these systems have become a useful tool in the operating room, leading to optimized clinical results and minimized surgical risks [2]. Furthermore, navigation systems allow surgeons to plan therapy preoperatively [3]. Particularly when surgical procedures are complex, they can be carried out according to a preoperative plan based on CT and MRI scan data. Laser-induced interstitial thermotherapy (LITT) has proved to be an effective and minimally invasive surgical technique for the treatment of a variety of vascular malformations, hemangiomas and tumors [6-10]. LITT involves the percutaneous introduction of an optical fiber through a cannula. The heat that is generated at the bared fiber's tip through the delivered light creates a localized coagulative necrosis. The fiber is located interstitially and is not visible,
in contrast to an open surgical procedure. Application of thermal energy occurs remotely from the surgeon, so control of treatment is needed. Solutions have been found through the use of radiologic imaging techniques offering accurate image-guided placement of the optical fiber into tumorous tissue and intraoperative imaging to monitor tissue changes. These solutions rely on the intraoperative use of MRI and/or ultrasound (US) [9,10,16]. Techniques have often been limited due to slow MRI image acquisition sequences, closed or open MRI scanners, and restricted fields of application with regard to ultrasound.
2. Patients, Materials and Methods
Image-guided LITT was performed in 5 procedures for venous malformations of the head and neck region (Figure 1). Before treatment, informed consent was obtained; risks, benefits and therapeutic alternatives were explained to the patient. Preliminary color-coded duplex sonography using a Siemens Sonoline Elegra system was performed to document the extension of the vascular tissue in each case. All interventional procedures were done under general anesthesia. Before treatment, MRI and CT scans of the patient's head and neck area were performed with reproducible distraction of the occlusal plane using an individualized intermaxillary splint (Figure 1). An isotropic MRI data set, which can provide high-resolution axial, coronal and sagittal reconstructions, was acquired to delineate the extent of the venous malformation. Bony structures of the face and skull base were visualized in a spiral CT scan. Radiologic data were transferred via the hospital network to the navigation system. The VectorVision™ Compact Navigation System (BrainLAB, Munich, Germany) was used for preoperative planning and intraoperative navigation. Consisting of two infrared high-precision cameras encircled by infrared light-emitting diodes, the VectorVision™ system works on passive marker technology. Via reflection of the emitted infrared light by three reflective marker spheres attached to the patient and to the surgical instruments, a computer workstation calculates the position of each surgical instrument in relation to the position of the patient. A monitor displays the location of instruments in 3D and in real time, projected onto the CT and MRI scans. Furthermore, the software contains various tools for virtual pointer elongation, surgical path planning and linear measurement.
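The pose computation behind such passive optical tracking can be illustrated with a small sketch: given the known tool-frame coordinates of the three marker spheres and their measured camera-frame positions, a least-squares rigid fit (Arun/Kabsch) yields the transform that localizes the instrument. This is our own illustration of the principle, not BrainLAB's implementation:

```python
import numpy as np

def marker_pose(tool_pts, camera_pts):
    """Rigid transform (R, t) mapping the tool-frame marker-sphere
    coordinates onto their measured camera-frame positions, via the
    Arun/Kabsch least-squares fit.  Any tool-frame point p (e.g. an
    instrument tip) is then located in the camera/patient frame as
    R @ p + t."""
    ct, cc = tool_pts.mean(axis=0), camera_pts.mean(axis=0)
    H = (tool_pts - ct).T @ (camera_pts - cc)      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cc - R @ ct
```

Three non-collinear spheres are the minimum that determines a unique rigid pose, which is why reference arrays carry at least three markers.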
Figure 1. 45-year-old patient with a large venous malformation of the right aspect of the neck extending into the deep pharyngeal wall and leading to airway obstruction and severe dyspnoea. The arrow is pointing at an individualized intermaxillary splint needed for adynamic mandibular registration.
Location and extension of the lesions were visualized in the CT and MRI scans on the navigation screen. CT and MRI scan data were merged, a 3D reconstruction of the patient's head and neck was made, lesion margins were defined and an accurate puncture route was determined virtually. A Nd:YAG (neodymium-yttrium aluminium garnet) laser (MY 60, Martin Medizintechnik, Tuttlingen, Germany) emitting a wavelength of 1064 nm at 5 W through a 600 µm bare fiber in continuous mode was used. Direct puncturing of the lesion was performed using a rigid steel cannula with an external diameter of 16 gauge and an internal lumen containing a sliding 600 µm bare fiber. The cannula was connected to a bare fiber holder adapted to the BrainLAB tool adapter, an instrument consisting of three reflective marker spheres enabling 3D orientation of the device (Figure 2). This surgical instrument was calibrated to the navigation system, which displays its 3D location on the navigation screen in an interactive, continually updated mode. The fiber was placed, slidingly adjustable, into the cannulation tool (Figure 2). The skull reference array was attached to the patient's head and patient-to-image registration was performed using surface scanning with a class I laser device called Z-Touch™. A tool adapter was fixed to the bare fiber holder; registration of instruments was then performed using the BrainLAB calibration
matrix (Figure 2). After the calibration procedure, the position of the cannula's tip and its trajectory were displayed on the navigation screen.
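Tool calibration of this kind is commonly done by pivoting the instrument about its fixed tip and solving a small linear least-squares problem for the tip offset. The following sketch illustrates that standard idea (our own, not the BrainLAB calibration-matrix procedure):

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Estimate the tool-tip offset (in the tool frame) from a pivot motion.

    While the tip rests on one fixed point, every tracked pose (R_i, t_i)
    of the reference array satisfies  R_i @ p_tip + t_i = p_pivot.
    Stacking all poses gives the linear least-squares system
        [R_i | -I] [p_tip; p_pivot] = -t_i,
    solved here with numpy's lstsq."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        A[3 * i:3 * i + 3, 0:3] = R        # coefficient of p_tip
        A[3 * i:3 * i + 3, 3:6] = -np.eye(3)  # coefficient of p_pivot
        b[3 * i:3 * i + 3] = -t
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]   # tip offset (tool frame), pivot point (camera frame)
```

At least two poses with genuinely different orientations are required for the system to be well conditioned; in practice many poses are averaged through the least-squares solve.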
Figure 2. Toolset for computer assisted laser-induced thermotherapy: (a) Calibration tool. (b) 16 gauge cannula with introduced 600 µm optical fiber. The cannula is connected to the bare fiber holder that carries the reference array for intraoperative registration. (c) A "pointer" needed for localization and registration of anatomic landmarks.
Following disinfection of the skin and fixation of the patient's mandible into the CT- and MRI-scanned position using the interocclusal splint, the lesion was punctured and the cannula was positioned within the malformed tissue by monitoring both the direction of the cannula and the position of the cannula's tip on the navigation screen, in accordance with the preoperatively planned treatment (Figures 3 and 4). After navigated placement of the cannula via extra- and intraoral approaches, the fiber was advanced in a straight line over a distance of 1 mm into the malformed tissue. During the subsequent irradiation, the cannula with the contained bare fiber was drawn back slowly, leaving coagulated tissue behind. After laser emission the fiber was retracted and the cannula repositioned. By repeating this procedure several times, the extent of the treated area was
enlarged. A power of 5 W was chosen to prevent superficial layers from coagulation necrosis. After treatment, extensive tissue swelling of the treated volumes was prevented by intravenous application of corticosteroids.
Figure 3. Screenshot of the navigation monitor displaying a cannulation procedure of a vascular malformation for laser-induced thermotherapy. The tip of the cannula is pointing at malformed tissue.
3. Results
The system was set up in a short time, and tracking of the surgical instruments occurs wirelessly. There is no need for head fixation with a Mayfield clamp. Preoperative 3D visualization and planning of target-fixed surgery proved to be a helpful tool. The topographic relationship between the instrument and anatomical structures was available at all times. Intraoperatively, the correct position of the fiber's tip was easily and clearly identified by monitoring the continually updated visualization of the cannula on both axial and coronal images of the CT and MRI scans as well as on the CT-MRI merged scan (Figure 3).
LITT was performed at different anatomic sites, at each site with an output of 5 W. No major complications such as nerve damage or skin necrosis occurred. Minor complications such as local swelling and pain were well controlled. The clinical follow-up showed impressive volume reductions of the venous malformations, improving the patients' function.
Figure 4. The intraoperative setup during navigated laser-induced thermotherapy.
4. Discussion
There are several reports demonstrating the efficacy of Nd:YAG laser-induced thermotherapy in the treatment of vascular malformations [6-10,16]. Since all types of vascular malformation show no involution but expansion and progression during lifetime, effective treatment is required. Considering the deep penetration and wide scattering in tissue, the treatment of vascular anomalies with laser-induced thermotherapy remains a surgical challenge, especially in anatomical regions not easily accessible such as the pharyngeal region [7-9,14].
Image-guided treatment improves preoperative planning by visualizing the topography of the malformation and the surrounding tissue. Intraoperative navigation makes laser treatment more reliable by precisely showing the
margins of the vascular malformation, preserving vital structures, and guiding laser treatment to preplanned targets. Combined with computer assisted surgery, it opens a new route of access not only for LITT but for all different kinds of surgery, with many potential applications. Using navigated LITT, the risk of side effects such as damage to vital structures can be reduced. Furthermore, there is therapy-placement control in regions that are not easily controlled by ultrasound systems. However, the accuracy of computer assisted surgery is limited by its dependency on preoperatively acquired CT and MRI scan data that are not updated with intraoperative tissue alteration following surgical manipulation. Nor is it possible to assess therapeutic benefit intraoperatively. Further investigations (for example combining a navigation system with an ultrasound scanner, or performing computer assisted surgery with continually updated navigation system data provided by an open-configuration MRI) leading to advanced methods need to be done.
References
1. L. Joskowicz, C. Milgrom, A. Simkin, L. Tockus and Z. Yaniv, Comput Aided Surg. 3, 271 (1998).
2. K. Montgomery, M. Stephanides and S. Schendel, Comput Aided Surg. 5, 90 (2000).
3. R. Marmulla, M. Hilbert and H. Niederdellmann, J Craniomaxillofac Surg. 25, 192 (1997).
4. A. Wagner, O. Ploder, G. Enislidis, M. Truppe and R. Ewers, Int J Oral Maxillofac Surg. 25, 147 (1996).
5. M. Kaus, R. Steinmeier, T. Sporer, O. Ganslandt and R. Fahlbusch, Neurosurgery. 41, 1431 (1997).
6. T. J. Vogl, M. G. Mack, P. K. Muller, R. Straub, K. Engelmann and K. Eichler, Eur Radiol. 9, 1479 (1999).
7. D. Cholewa, F. Wacker, A. Roggan and J. Waldschmidt, Lasers Surg Med. 23, 250 (1998).
8. F. K. Wacker, D. Cholewa, A. Roggan, A. Schilling, J. Waldschmidt and K. J. Wolf, Radiology. 208, 789 (1998).
9. N. Hayashi, T. Masumoto, T. Okubo, O. Abe, N. Kaji, K. Tokioka, S. Aoki and K. Ohtomo, Radiology. 226, 567 (2003).
10. A. H. Dachman, J. A. McGehee, T. E. Beam, J. A. Burris and D. A. Powell, Radiology. 176, 129 (1990).
11. A. Masters and S. G. Bown, Ann Chir Gynaecol. 79, 244 (1990).
12. B. M. Lippert, A. Teymoortash, B. J. Folz and J. A. Werner, Lasers Med Sci. 18, 19 (2003).
13. L. D. Derby and D. W. Low, Ann Plast Surg. 38, 371 (1997).
14. M. Poetke, C. Philipp and H. P. Berlien, Arch Dermatol. 136, 628 (2000).
15. S. Morikawa, T. Inubushi, Y. Kurumi, S. Naka, K. Sato, T. Tani, H. A. Haque, J. Tokuda and N. Hata, Acad Radiol. 10, 180 (2003).
16. D. Groot, J. Rao, P. Johnston and T. Nakatsui, Dermatol Surg. 29, 35 (2003).
MINIMALLY INVASIVE NAVIGATION-ASSISTED BONE TUMOR EXCISION IN THE PARIETAL SKULL
J. HOFFMANN, D. TROITZSCH, C. WESTENDORFF, F. DAMMANN*, S. REINERT
Department of Oral and Maxillofacial Surgery, Tübingen University Hospital, Osianderstrasse 2-8, 72076 Tübingen, Germany; *Department of Radiology, Tübingen University Hospital, Hoppe-Seyler-Strasse, 72076 Tübingen, Germany
Calvarial bone tumors are highly complicated to remove, especially those originating from the diploic space. Primary treatment consists of surgical resection. Because precise localization is difficult, surgical approaches are extensive and thus associated with significant bone deformity. The clinical advantages of navigation-assisted treatment of skull bone tumors are apparent. A 39-year-old male patient with a history of renal cell carcinoma was admitted to our hospital because a radionuclide scintigraphic bone scan revealed increased uptake in a small area of the left lateral skull bone. High-resolution computed tomography showed that the lesion was located in the diploic space, destroying the inner table of the calvarium. The patient underwent minimally invasive bone lesion removal using an interactive image-guided approach. Complete resection of the neoplastic lesion was achieved. Histopathological examination revealed an ossifying fibroma. The postoperative course was uneventful and the patient was discharged three days after the intervention. To date there has been no evidence of local recurrence. Interactive multimodal planning and intraoperative image guidance offer an interesting approach for biopsy and minimally invasive removal of small skull lesions, especially in less accessible locations.
1. Background
Ossifying fibroma is a rare, benign fibrous tumor showing local aggressiveness.1,2 In children, the most common site is the tibia, followed by other long bones.3 In adults, it mainly involves the mandible, followed by the maxillary sinus, and rarely the skull bone. Ossifying fibroma lesions occur commonly during the first decade of life and present clinically as painless, slowly enlarging masses.4 The differential diagnosis between ossifying fibroma and fibrous dysplasia is exceedingly difficult, with great impact on the therapeutic consequences.5 The radiologic picture of ossifying fibroma is characterized by a well-circumscribed lytic lesion, and on bone scan it shows increased uptake. Directed biopsy or excision may be required to determine its histologic nature, but localization of the site may often be delicate. Whereas in older patients treatment consists of conservative curettage, in younger patients, against the background of a high rate of recurrence, surgical excision through extensive craniectomy has become first-line therapy for ossifying fibroma of the skull, often causing significant cranial bone deformity with dissatisfying aesthetic and functional results.6,7 Recently, image-based surgical navigation systems have played an increasingly important role in surgical procedures in the settings of neurological and orthopaedic surgery. These systems allow the surgical instrument tip position to be displayed interactively in near real-time within 3D imaging data. Pre-surgical image-based planning combined with minimally invasive surgical navigation was used to localize and remove a small lesion located in the diploic space, destroying the inner table of the calvarium.
2. Materials and Methods
A 39-year-old male patient was referred to our department for assessment and resection of a small calvarial lesion (Figure 1).
Figure 1. Radionuclide bone scan revealed a circumscribed region of increased uptake in the left parietal bone (LEFT). Preoperative virtual reality-reconstructed three-dimensional computed tomography of the skull revealed a focal osteolytic lesion with an intact external calvarial table (RIGHT).
The patient had a history of left abdominal nephrectomy ten months earlier due to renal cell carcinoma (pT1, N0, Mx). A routine follow-up scintigraphic bone scan revealed a small, well-circumscribed region of increased radionuclide uptake of unknown etiology located at the left parietal bone. At the time of admission the patient had no complaints, and the physical examination was unremarkable. There was no history of cranial trauma or pain. No solid prominence or soft-tissue swelling was palpable in the mentioned area. He was otherwise well, with a normal neurological examination. High-resolution computed tomography (CT) (Siemens Somatom Sensation 16) of the calvarium was performed. The CT data revealed a focal, rounded, circumscribed osteolytic lesion in the left parietal bone. The lesion, measuring approximately 0.5 inches in diameter (Figure 2), involved the diploic space and the inner table of the calvarium. On admission, the appearance was worrisome for a lytic metastasis originating from the pre-existing renal cell cancer. The CT data were transmitted via the hospital network to the navigation planning platform. The lesion area was detected in the multiplanar sections and the region of interest was then marked by color (Figure 2). The target point determined from the morphological CT data was verified by rechecking the functional radionuclide scintigraphic scan.
Figure 2. Image-based surgical planning using high-resolution computed tomography image data. LEFT: Colored skull lesion in relation to the surface 3D-reconstruction. RIGHT: Colored region of interest in relation to the 3D bone reconstruction.
The navigation system used in this study was the VectorVision™ (BrainLAB, Munich, Germany). VectorVision™ works on the basis of a “passive marker” technology: infrared light-emitting diodes positioned around the system’s cameras illuminate reflective markers plugged onto universal tool adapters, allowing arm-free, wireless tracking of pointers or surgical instruments. The software contains various tools for virtual pointer elongation, surgical path planning and linear measurement. The operation was performed under general anaesthesia. After attaching the drill-fixed dynamic skull reference array to the patient’s head, the patient-to-image registration was performed by surface scanning with a class I laser device called “z-touch™”, which matches the patient’s head surface with the tomographic image data. After registration, the axial, coronal and sagittal reconstructions of the pointer tip position were displayed in real time on the navigation display. The registration accuracy was checked by identification
of anatomical landmarks. The localization of the lesion was identified by pointer-guided navigation and marked on the skin (Figure 3).
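The patient-to-image registration step can be illustrated with a small sketch. Assuming corresponding landmark coordinates in patient space and image space are already paired (the pairing step itself is not shown, and `rigid_register` is our own illustrative name, not part of the VectorVision™ software), the least-squares rigid transform can be computed with the classical SVD method of Arun et al.:

```python
import numpy as np

def rigid_register(model_pts, patient_pts):
    """Least-squares rigid transform (R, t) mapping patient_pts onto
    model_pts via SVD. Both arrays are Nx3 with corresponding rows."""
    cm = model_pts.mean(axis=0)
    cp = patient_pts.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (patient_pts - cp).T @ (model_pts - cm)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cm - R @ cp
    return R, t
```

The residuals between `R @ p + t` and the corresponding image points then give a registration error estimate of the kind that is verified against anatomical landmarks.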
Figure 3. Navigation system screen shot displaying a view of the preplanned surgical approach in all three dimensions. The region of interest is marked by color. The “pointer” is focussed upon the target point.
In this region a minimal skin incision was made and the skull bone was exposed by subperiosteal preparation, showing an intact outer table without any clinical signs of osteolysis. On the skull surface the bone lesion was precisely re-localized as the target point in alignment with the multiplanar and 3D CT data. The target point was then marked. A surgical drill was used to form a cylindrical resection path around the virtually planned lesion area, and the outer and inner laminae of the skull containing the circumscribed lesion were then removed en bloc. The dura was exposed but remained intact (Figure 4). The wound was irrigated and closed. Owing to the minimally invasive extent of the bone resection there was no need for any reconstructive cranioplasty.
Figure 4. Surgical access during the image-guided procedure. LEFT: Intraoperative view after surgical removal of the bone tumor. RIGHT: Photograph showing the minimally invasively resected bone containing the ossifying fibroma lesion with sufficient safety margins.
3. Results
The procedure using image-guided surgical navigation was successful: the region of interest could be detected within a short time, and the skull bone lesion was removed completely without causing extensive collateral damage. The tumor did not invade the dura mater or the periosteum. The intraoperative and postoperative courses were uneventful and the patient was ready for discharge 3 days after surgery. Histopathological examination of the excised bone tissue revealed irregular ribbon-like trabeculae lined by osteoblasts, with no signs of malignancy and clear resection margins. Based on the clinical, histological and radiological features of the excised bone lesion, the diagnosis of ossifying fibroma was established. There was no evidence of regional recurrence 20 months later.
4. Discussion
Successful treatment of ossifying fibroma requires complete removal of the bone lesion to avoid progression and to obtain histopathological proof of the diagnosis.2,4,5,7,11,12 The conventional therapeutic strategy has been wide-area surgical excision, because small lesions are difficult to localize. In order to avoid incomplete resection, a substantial bone segment may need to be removed.4,12 Extensive craniectomy potentially exposes the brain and possibly requires bone reconstruction, which, in turn, poses limitations on postoperative recovery. Attempts to operate minimally invasively in the setting of craniomaxillofacial surgery have proven more challenging.8,16 It has recently become evident that surgical strategies based on tomographic image data (CT, MR) rather than projection radiography are essential to improve surgical efficacy in complex anatomical situations.16 More recently, image-based surgical navigation has been employed for minimally invasive procedures in surgery.16 The small size of many bone lesions makes these techniques particularly suitable for minimally invasive biopsy or excision. Combined with precise localization by fusion with multimodal imaging data (CT, MR, SPECT, PET), this may minimize the damage to surrounding tissue and reduce postoperative morbidity. Image guidance provides three-dimensional planning and localization using multimodal diagnostic image data. This is especially important in locations where open access requires large exposure, which is associated with difficulty in finding the bone lesion. The advantages of using navigated minimally invasive access for the treatment of ossifying fibroma include a shorter hospitalization with a rapid return to full social activity.
References
1. F. Alawi, Am J Clin Pathol. 118 Suppl, S50 (2002).
2. D. J. Beasley and F. E. LeJeune, Jr., J La State Med Soc. 148, 413 (1996).
3. N. J. Khoury, L. N. Naffaa, N. S. Shabb and M. C. Haddad, Eur Radiol. 12 Suppl 3, S109 (2002).
4. H. K. Williams, C. Mangham and P. M. Speight, J Oral Pathol Med. 29, 13 (2000).
5. D. J. Commins, N. S. Tolley and C. A. Milford, J Laryngol Otol. 112, 964 (1998).
6. J. Kuttenberger, N. Hardt and J. O. Gebbers, Mund Kiefer Gesichtschir. 7, 47 (2003).
7. R. C. O'Reilly, B. E. Hirsch and S. B. Kapadia, Am J Otolaryngol. 21, 131 (2000).
8. P. Grunert, K. Darabi, J. Espinosa and R. Filippi, Neurosurg Rev. 26, 73 (2003).
9. J. K. Han, P. H. Hwang and T. L. Smith, Curr Opin Otolaryngol Head Neck Surg. 11, 33 (2003).
10. L. M. Specht and K. J. Koval, Bull Hosp Jt Dis. 60, 168 (2001).
11. J. Rinaggio, M. Land and D. B. Cleveland, J Pediatr Surg. 38, 648 (2003).
12. S. Vlachou, G. Terzakis, G. Doundoulakis, C. Barbati and G. Papazoglou, J Laryngol Otol. 115, 654 (2001).
13. J. Schiffer, R. Gur, U. Nisim and L. Pollak, Surg Neurol. 47, 231 (1997).
14. P. A. Winkler, W. Stummer, R. Linke, K. G. Krishnan and K. Tatsch, J Neurosurg. 93, 53 (2000).
15. X. J. Yang, G. L. Hong, S. B. Su and S. Y. Yang, Chin J Traumatol. 6, 99 (2003).
16. J. Hoffmann, F. Dammann, D. Troitzsch, S. Muller, M. Zerfowski, D. Bartz and S. Reinert, Biomed Tech (Berl). 47 Suppl 1 Pt 2, 728 (2002).
17. M. J. Troulis, P. Everett, E. B. Seldin, R. Kikinis and L. B. Kaban, Int J Oral Maxillofac Surg. 31, 349 (2002).
18. S. Hassfeld and J. Muhling, Int J Oral Maxillofac Surg. 30, 2 (2001).
Simulation and Modelling
REALISTIC HAPTIC INTERACTION FOR COMPUTER SIMULATION OF DENTAL SURGERY
A. PETERSIK, B. PFLESSER, U. TIEDE, K. H. HÖHNE, M. HEILAND, H. HANDELS
Institut für Medizinische Informatik, Klinik und Poliklinik für Zahn-, Mund-, Kiefer- und Gesichtschirurgie, Universitätsklinikum Hamburg-Eppendorf, Martinistr. 52, 20246 Hamburg, Germany
E-mail: petersik@uke.uni-hamburg.de
In this paper we present a system for simulating apicoectomies. The defined reduction of bone with a drill, without injuring the structures lying within, is an essential part of surgical technique, especially in dental surgery. A computer-based simulator with haptic feedback could reduce costs and offers new training possibilities. However, due to extremely demanding computational requirements, haptic rendering of highly detailed anatomic models together with interactive techniques for the reduction of bone is still a big challenge. In the following, we present new algorithms which, on the one hand, allow realistic haptic rendering of even very small anatomic structures and, on the other hand, are capable of simulating the reduction of bone with realistic haptic sensations. As an application, a system for simulating apicoectomies is shown.
1. Introduction
In surgery simulation it is often not sufficient to deal only with a visual representation of the anatomical situation. For most applications the sense of touch is necessary for realistic interaction. Most of the published applications for surgery simulation deal with interventions that involve elastic deformations of soft tissue. The simulation of material removal from rigid objects (sawing, milling, drilling), as performed in dental surgery, is a less developed field and has other requirements than the simulation of soft tissue deformation. One important requirement is that the properties and the extent of the tool must be considered for collision detection and force calculation. The algorithms for graphic and haptic rendering must also be able to realistically render the irregularly shaped cut surfaces resulting from the combination of existing cavities and drilled holes. These requirements are fulfilled best by using a volume model. For realistic haptic rendering it has the decisive advantage over a polygonal representation that the computation time for collision detection is independent of the complexity of the scene. In contrast, the cost of collision detection in a polygonal model depends on the number of triangles in the scene. Especially when drilling into the bone, more and more triangles would be created, which would increase the cost even further. The purpose of the work presented in this paper is to develop algorithms which can calculate the forces for haptic feedback at high resolution. A realistic feeling should be provided while touching the model as well as while actively modifying the model with virtual tools. The algorithms should be able to handle models of arbitrary complexity. The feasibility of the algorithms is demonstrated with a system for simulating apicoectomies.
2. Related work
In order to achieve realistic haptic surface rendering, collisions between the tool and the objects in the scene must be detected and a collision-free position for the tool must be computed. Figure 1 shows at left a situation where the user moves the end-effector of the force-feedback device from position 1 to position 2, which could not have been reached in reality. Therefore the correct position 3 must be calculated by the haptic rendering algorithm, and a force which pushes the haptic device from position 2 to position 3 must be applied. A simple approach to calculating position 3 is to look for the shortest distance between position 2 and the surface. Such an approach is called "penalty based".1 In figure 1 at right a situation is shown where this approach does not work. Here, the movement of the tool over time must be considered. Therefore constraint-based approaches2 were introduced. These approaches use an intermediate object (called god-object or proxy) which stores the last position where the tool was located on the object's surface. To find the new position 3 on the surface, the proxy object located at position 1 is moved along the surface to reduce the distance to position 2 until a local minimum is reached.
Figure 1. Left: Principle of haptic rendering: The user pushes the stylus of the force-feedback device from its original position 1 to position 2. A force is calculated and applied to push the tool into the physically correct position 3. Right: The penalty based approach fails.
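The god-object/proxy idea can be reduced to a one-dimensional sketch. The wall geometry, stiffness value and function names below are illustrative assumptions, not taken from the cited papers; the point is only that the proxy remembers the side of first contact, which the pure penalty approach loses when the device penetrates past the middle of a thin object:

```python
# Thin wall occupying 0 <= x <= 0.1 (arbitrary units). The proxy stays on
# the surface where the tool first made contact, so a deep device position
# is not snapped through to the far side of the wall.
WALL_MIN, WALL_MAX = 0.0, 0.1
STIFFNESS = 800.0  # force per unit penetration (illustrative value)

def update_proxy(proxy_x, device_x):
    inside = WALL_MIN < device_x < WALL_MAX
    if not inside:
        return device_x          # free motion: proxy follows the device
    # Constrain the proxy to the wall face on the side it came from.
    return WALL_MIN if proxy_x <= WALL_MIN else WALL_MAX

def feedback_force(proxy_x, device_x):
    # Spring force pulling the device back toward the proxy (position 3).
    return STIFFNESS * (proxy_x - device_x)
```

With a penalty approach, a device at x = 0.09 would be pushed to the nearer face at x = 0.1, i.e. through the wall; the proxy keeps it anchored at x = 0.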
A problem of many approaches to haptic rendering is that only one point of the tool is considered when calculating collisions and tool positions. The haptic feeling is thus not realistic, in that these approaches do not prohibit unrealistic situations like entering a small hole with a large drill. Therefore collision detection approaches which consider the shape of the tool were developed.3 This approach uses a binary voxel model which can handle scenes of arbitrary complexity. While parts of this approach could be applied directly to the area of medical simulation, the binary voxel model would lead to a low resolution when using medical data. In order to simulate material removal procedures like drilling or milling realistically for surgical simulations, it is necessary to develop:
- A precise tool representation which represents the shape and extent of the tool.
- Collision-detection algorithms which are not limited to a binary voxel representation of the image objects but use grey values to determine object surfaces at sub-voxel resolution.
- An algorithm for the calculation of realistic drilling forces which allows realistic removal of bone.
Additionally, all haptic algorithms must run with an update rate of at least 1 kHz.
3. Methods
3.1. Data representation
We are working with models based on medical volume data from CT or MRI. The volume is represented by isotropic attributed voxels. The voxel attributes are grey values and membership of an organ. The membership is determined during a semi-automatic, threshold-based segmentation process. Since the voxel-based representation does not contain an explicit representation of the object surfaces, the surfaces must be calculated during the rendering process. This is done by a ray-casting algorithm4 which renders iso-surfaces at sub-voxel resolution. Such calculations are very time-consuming and should be minimized to the extent absolutely necessary. On the other hand, our sub-voxel approach leads to smooth and detailed surfaces which can be explored at any scale.
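The sub-voxel surface location along a cast ray can be sketched as follows; the sampling scheme and function name are our own illustration, not the VOXEL-MAN implementation. Grey values sampled along the ray are compared against the segmentation iso-value, and the crossing is refined by linear interpolation between the two bracketing samples:

```python
def surface_hit(values, iso, t0=0.0, dt=1.0):
    """March along a ray through sampled grey values; on the first
    interval that crosses the iso-value, refine the hit by linear
    interpolation, giving a surface position at sub-voxel resolution."""
    for i in range(len(values) - 1):
        a, b = values[i], values[i + 1]
        if (a - iso) * (b - iso) <= 0 and a != b:
            frac = (iso - a) / (b - a)   # sub-voxel offset in [0, 1]
            return t0 + (i + frac) * dt
    return None                           # ray misses the surface
```

Because the returned position varies continuously with the grey values, neighbouring rays produce a smooth surface rather than a voxel staircase.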
3.2. Haptic surface rendering
In the following we describe the methods for passive haptic interaction with a model, i.e. when the user is not removing material. For our purposes, a collision detection approach which considers the shape and the extent of the tool is needed. We therefore chose the approach of McNeely3 as a starting point. However, we had to extend and modify this approach in several respects. Instead of a binary voxel model, we use a voxel model containing the density values of the original medical data, e.g. from CT. We changed the collision detection algorithm (see 3.2.2) so that it uses the location of the object surfaces, which is calculated at sub-voxel resolution (see 3.1). This leads to a more precise calculation of the force direction and surface location. Another great advantage of the sub-voxel resolution is that force discontinuities cannot appear when the tool crosses voxel boundaries in a sliding motion. Thus no force filtering or averaging is needed, which improves the haptic sensation of small details. In contrast to the approach of McNeely3, we use a refined tool representation, which is described in the next section.
3.2.1. Tool representation
Figure 2. Tool representation by surface points and inward pointing normal vectors.
The tool is represented by a number of sample points Pi on the tool's surface. The sample points are used for collision detection, and therefore they should be distributed at preferably equal distances over the tool's surface. This way collisions with every part of the tool are detected. The distance between two neighboring sample points Pi must be small enough that small objects in the scene cannot pass between two points without a collision; otherwise such small objects could not be explored in a realistic manner. To speed up computation, each point has an associated normal vector ni which points to the inside of the tool and is used by the collision detection algorithm to compute a vector pointing to a
collision-free position, as described in the next section. All vectors ni have the same length.
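A roughly equidistant distribution of sample points with inward-pointing normals, as described above, can be sketched for a spherical drill with a Fibonacci lattice; the lattice choice and function name are our assumptions (the paper states only that 56 sample points are used for the spherical tools):

```python
import math

def sphere_sample_points(radius, n=56):
    """Approximately evenly spaced sample points on a spherical tool
    surface (Fibonacci lattice), each paired with an inward-pointing
    unit normal for multi-point collision detection."""
    golden = math.pi * (3.0 - math.sqrt(5.0))   # golden angle
    points = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n           # height in (-1, 1)
        r = math.sqrt(1.0 - z * z)
        theta = golden * i
        p = (r * math.cos(theta), r * math.sin(theta), z)
        normal = (-p[0], -p[1], -p[2])          # points to the tool centre
        points.append((tuple(radius * c for c in p), normal))
    return points
```

Each returned pair is a surface point (scaled to the drill radius) and its unit inward normal, so every normal points from the surface toward the tool centre.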
3.2.2. Collision detection algorithm
Figure 3. Calculation of forces during passive interaction. The resulting force direction fs is calculated by adding the three vectors f1, f2 and f3.
In order to calculate the collision force direction, each surface point of the tool is checked as to whether it is inside or outside the object. All surface points Pi which are inside the object (Fig. 3, filled dots) are traced along the inward-pointing normal until the surface is found. The difference vector from Pi to the surface gives the vector fi. All found vectors fi are added, and the direction of the sum vector fs is the direction of the force vector which must be applied to the haptic device. If the magnitude of the sum vector fs were used as the magnitude of the force, the force would depend directly on the number of vectors, which is not correct and leads to force discontinuities. Averaging, i.e. dividing the force sum by the number of vectors found, also leads to force discontinuities. Instead, the magnitude of the force must be proportional to the penetration depth. To find the penetration depth we project each vector fi onto the vector fs (Fig. 3). We found that the magnitude of the largest resulting projection is a good approximation of the penetration depth. The result is a translation vector which pushes the tool to the surface of the object.
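The passive force computation just described can be sketched directly. The input is the list of difference vectors fi from penetrating tool points to the surface; the stiffness scaling is an illustrative assumption:

```python
import numpy as np

def contact_force(surface_vectors, stiffness=1.0):
    """Force from per-point difference vectors f_i (penetrating tool
    point to object surface). Direction: sum of all f_i. Magnitude:
    the largest projection of any f_i onto that direction, taken as
    the penetration depth (times an illustrative stiffness)."""
    F = np.asarray(surface_vectors, dtype=float)
    if F.size == 0:
        return np.zeros(3)                 # no point is penetrating
    s = F.sum(axis=0)
    norm = np.linalg.norm(s)
    if norm == 0.0:
        return np.zeros(3)                 # degenerate: vectors cancel
    direction = s / norm
    depth = float((F @ direction).max())   # deepest point sets the depth
    return stiffness * depth * direction
```

Note that the magnitude depends only on the deepest point, not on how many sample points happen to be inside, which is exactly what avoids the force discontinuities described above.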
3.3. Volume interaction
In contrast to the passive interaction described in section 3.2, volume modification includes not only collision detection but also modification of the volume and rendering updates. Here the main problem is that it is not possible to modify the volume at the frequency of at least 1 kHz necessary for the haptic update. Instead, the modification of the volume data runs asynchronously to the haptic process at about 20 Hz. This means that the force calculation during drilling cannot rely on the volume data, in which not all of the current drilling operations have yet been modelled. Therefore the algorithm of section 3.2, which relies on the exact determination of object surfaces, cannot be used. The basic idea behind our force calculation algorithm during drilling is that the drill moves towards unmodified regions, while the portions behind the drill can be recalculated and redrawn asynchronously to the force calculation. Instead of collision detection with object surfaces, we use sample points in front of the drill to calculate the amount and distribution of the material to be removed (Fig. 4). The location of the sample points in front of the drill ensures that collision detection is only done in regions where no modification of the volume takes place.
Figure 4. Calculation of forces during drilling. The resulting force is dependent on collisions with sample points around the drill.
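The asynchronous coupling between the 1 kHz haptic process and the roughly 20 Hz volume modification can be sketched as a producer/consumer queue; all names here are illustrative, and real synchronization details (threads, locking, timing) are omitted:

```python
from collections import deque

# The fast haptic loop only *records* removal operations; the slow
# rendering/modification loop drains the queue in batches, so force
# computation never waits for the (slow) volume update.
removal_queue = deque()

def haptic_step(drill_pos):
    # ... force computed here from sample points in front of the drill ...
    removal_queue.append(drill_pos)   # record material to remove

def modification_step(volume):
    while removal_queue:              # drain everything queued so far
        pos = removal_queue.popleft()
        volume.add(pos)               # placeholder for actual voxel carving
```

Many haptic steps can thus run between two modification steps, which matches the ~50:1 ratio of the two update rates reported in the paper.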
To get a realistic force feedback during drilling, we apply as a first approximation a force pushing against the drilling direction. The force magnitude is proportional to the speed of the drill movement. The vector of this force points from the current position of the tool PC to the last stored position PL. Additionally, we consider the material distribution around the drill and the amount of removed material by looking for collisions at the sample points in front of the drill (Fig. 4). The balance point b of the mass removal in tool coordinates can be calculated by adding all colliding active points Pi of the tool and dividing the sum by the number c of colliding points found:

b = (1/c) Σi Pi

The direction of the drilling force vector f can then be calculated from the movement vector and the balance point. Figure 4 shows an example of such a calculation: material will be removed, and the vector of the drilling force is rotated by about 45 degrees because only one quarter of the spherical drill is in contact with the material.
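The balance-point computation, and a plausible way to combine it with the retarding force, can be sketched as follows. The balance point follows the equation in the text; since the paper's second equation is not reproduced here, the way the two terms are combined below is an assumption of this sketch, not the authors' exact formula:

```python
import numpy as np

def balance_point(colliding_points):
    """Balance point b of the removed material in tool coordinates:
    the mean of all colliding sample points P_i, i.e. b = (1/c) * sum P_i."""
    P = np.asarray(colliding_points, dtype=float)
    return P.mean(axis=0)

def drilling_force(p_current, p_last, colliding_points, k=1.0):
    """First approximation from the text: a force opposing the drill
    movement (from current position P_C toward last stored position P_L),
    tilted away from the balance point b of the material in contact.
    The additive combination is an assumption of this sketch."""
    base = np.asarray(p_last, float) - np.asarray(p_current, float)
    if len(colliding_points) == 0:
        return k * base                 # free drilling: retarding force only
    b = balance_point(colliding_points)
    return k * (base - b)               # tilt away from one-sided contact
```

With material on only one side of the drill, b is offset toward that side, so the force vector is rotated away from the pure retarding direction, as in the quarter-contact example of Figure 4.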
4. Application
Figure 5. Model with hidden bone. The apical inflammations of teeth 23, 25, 36 and 35 and the left nervus alveolaris inferior are visible.
To develop a system for simulating dental surgery, the presented algorithms for haptic feedback were integrated into the VOXEL-MAN system5. The model of the skull was created from CT data. Structures like the nervus alveolaris inferior and apical inflammations were added (see Fig. 5). As tools, different kinds of spherical drills can be used. We have implemented carbide drills and diamond drills of different sizes. The carbide drill removes bone more effectively and exhibits more vibrations, while the diamond drill is used for cautious bone removal with low vibrations. In order to get an adequate representation of the tool's shape (as described in section 3.2.1) while reducing computation to a minimum, we use 56 sample points on the tool's surface. The progress of the surgery is shown in figure 6. Cut planes can be used for inspection of the region 'in depth' at every stage of the surgery.
We implemented the system on a PC with a hyperthreading-enabled Pentium 4 processor and 1 GB RAM. As the haptic device we use a Phantom Desktop device (SensAble Technologies Inc.). Our system runs under SuSE Linux 9.0. For stereoscopic viewing we use an Nvidia Quadro4 graphics board in combination with shutter glasses. The haptic process runs at an update rate of 3 kHz while the graphical update rate is 20 Hz.
Figure 6. Step-by-step visualization of an apicoectomy of tooth 36. Upper left: Seeking the apical inflammations. Upper right: Removal of inflammation tissue. Lower left: Postoperative axial reconstruction of the resection. Lower right: Pseudo-sagittal reconstruction.
5. Results
We have developed improved algorithms for haptic material removal interactions with the following features:
- Congruent high visual and haptic realism is achieved by using an identical volume-based surface representation for visual and haptic rendering at sub-voxel resolution.
- Realistic tool-object interaction is achieved by a detailed tool representation with multi-point collision detection.
- Realistic haptic sensing of stiffness and drilling vibrations is achieved by tuning the algorithms to a haptic update rate of 3 kHz.
- Realistic haptic feedback during drilling is achieved by regarding the material distribution around the drill and simulating drilling vibrations.
6. Conclusions and future work
In this paper we have presented an approach for realistic haptic material removal interactions. On this basis, a system for simulating apicoectomies was developed. A first evaluation of 38 student questionnaires shows that offering virtual training possibilities to dental students is a valuable addition to existing teaching modalities. A further extension to other dental procedures, such as implantology, is currently being investigated. To this end, the presented algorithms will be extended to deal with more complex shapes and functions of the tool.
References
1. Massie, T.M., Salisbury, J.K.: The PHANToM haptic interface: A device for probing virtual objects. ASME Haptic Interfaces for Virtual Environment and Teleoperator Systems 1 (1994) 295-301
2. Zilles, C., Salisbury, K.: A constraint-based god-object method for haptics display. Proceedings of IEEE/RSJ (1995)
3. McNeely, W.A., Puterbaugh, K.D., Troy, J.J.: Six degree-of-freedom haptic rendering using voxel sampling. Computer Graphics (SIGGRAPH 99 Proceedings) (1999) 401-408
4. Tiede, U., Schiemann, T., Höhne, K.H.: High quality rendering of attributed volume data. In Ebert, D., Hagen, H., Rushmeier, H., eds.: Proc. IEEE Visualization '98, Research Triangle Park, NC (1998) 255-262 (ISBN 0-8186-9176-x).
5. Höhne, K.H., Pflesser, B., Pommert, A., Riemer, M., Schiemann, T., Schubert, R., Tiede, U.: A new representation of knowledge concerning human anatomy and function. Nat. Med. 1 (1995) 506-511
ROBOTIC SIMULATION SYSTEM FOR PRECISE POSITIONING IN MEDICAL APPLICATIONS
ECKHARD FREUND, FRANK HEINZE, JÜRGEN ROSSMANN
Institute of Robotics Research, Otto-Hahn-Str. 8, 44227 Dortmund, Germany, {freund, heinze, rossmann}@irf.de
One of the unsolved problems in medical technology is to find a methodology for the precise positioning of patients. We propose an integrated system approach based on the 3D robotic workcell simulation software COSIMIR®. The proposed system registers 2D x-ray images to a 3D model of a patient's region of interest, calculates a difference vector between the expected and the actual position, and visualizes the result. This paper provides first results based on a phantom study with synthetic and real x-ray images.
1. Introduction
Precise positioning and repositioning of patients is still an unsolved problem in medical technology. It is especially relevant in radiation therapy (radiotherapy), where ionizing radiation destroys malignant cells. Today, radiotherapy consists of the following steps: First, the therapist obtains a computed tomography (CT) scan of the patient's region of interest. Then he specifies a so-called isosurface, which defines the outer boundary of the target volume. The center of the target volume is called the isocenter. A planning system uses this data to calculate the paths and settings of a photon-beam linear accelerator (LINAC). In order to give healthy tissue a chance to recover, the therapy is usually fractionated into 30 to 50 sessions. The therapist has to position the patient before each single session. The positioning process relies on the visual matching of lines and marks on the patient's skin with laser lines in the treatment room. As the marked skin is flexible and by nature not rigidly attached to the defined isosurface inside the body, the positioning process contains in itself large potential inaccuracies. Today, therapists have to deal with an estimated inaccuracy of 10 mm in all spatial directions. This means that in practice the isosurface has to be enlarged by 10 mm to make sure that the beam covers all parts of the tumor in each session. This enlarged volume negatively impacts the maximum dose that can actually be delivered to the tumor, because the sensitivity of the surrounding tissue limits the dose. Increased positioning accuracy is therefore a key requirement for increasing the delivered dose, and a higher dose is expected to significantly increase the percentage of healed patients [1].
We propose an integrated system approach based on the 3D simulation software COSIMIR® (Cell Oriented Simulation of Industrial Robots) by transferring results from the simulation of robotic and automation systems in projective virtual reality [2] into a medical environment. COSIMIR® provides the central data processing unit, the visualization, and a simulation interface. It registers multiple 2D x-ray images to a 3D CT scan of the patient. Maintz and Viergever [3] give an overview of the state of the art in medical image registration methods. Bansal [4] registers portal images to the 3D CT. Penney [5] registers using single x-ray images. Viola [6] introduced mutual information as an efficient similarity measure for registration. Although there is a large body of literature on medical image registration, real clinical applications are still quite rare. One of the few examples of a research approach that has made its way into clinical usage is the Cyberknife [7]. Instead of the standard LINAC type of treatment equipment, the Cyberknife uses a six degrees of freedom (DOF) robot arm. An approach that fits into the clinical routine and uses standard equipment is still missing. This paper starts with an analysis of the system requirements, followed by a detailed description of the system architecture including its basic building blocks. The system approach is verified in a phantom study comprising a CT sequence of a human phantom skull and both synthetic and real x-rays. A discussion of the results concludes the paper.
2. System Requirements

Currently available sensors cannot measure the position of the treatment region directly. Nonetheless, it is possible to detect natural landmarks that are almost rigidly attached to a static treatment region. If these landmarks can be detected with high accuracy, the position of the treatment region can be deduced with equally high accuracy. An easily detectable natural landmark is the patient's skeleton, as it is well visible in CT scans and x-rays. For the treatment verification process, the therapist needs a system that is useful independently of the general location of the treatment region inside the body. The registration algorithms therefore have to be independent of special geometrical properties. The system should find the position alignment automatically and give visual feedback to the therapist, such that he can easily verify the result. It should provide a difference vector between the desired and the actual position. But it has to leave the final decision to the therapist: either to start the therapy session or to reposition the patient and start the verification process again. The system has to provide a very intuitive and easy-to-use interface; otherwise therapists and medical personnel will not accept it in their daily working routine.
3. System architecture
Our approach solves the problem of treatment setup verification by providing an integrated system based on the 3D simulation software COSIMIR®. COSIMIR® is developed at the Institute of Robotics Research in Dortmund, Germany, and supports a wide range of application areas, from space robotics through manufacturing automation to forest simulation. A 3D simulation system for setup verification supplies the therapist with a flexible and easy-to-use system that can also be used for treatment simulation and treatment visualization. The operating sequence is as follows: At first, the therapist makes a CT scan of the patient, which he feeds into the COSIMIR® system. COSIMIR® extracts a 3D surface model and adds it to the model of the treatment room, including the treatment couch and the LINAC. After the positioning of the patient on the treatment couch, two or more orthogonal x-ray images are taken and fed into the COSIMIR® system. During the registration, COSIMIR® finds the position of the 3D skeleton surface relative to the isocenter. The registration process assumes a rigid transformation with three translational and three rotational degrees of freedom. COSIMIR® calculates digitally reconstructed radiographs (DRRs) using a ray-casting approach. The system matches the calculated DRRs with the real x-ray images and derives a similarity measure. Currently, the user can choose between normalized cross-correlation and mutual information as similarity measures. An optimization process maximizes the similarity measure using either a simple line optimization along the 6 coordinate axes or Powell's optimization. After the registration, COSIMIR® provides a difference vector between the actual and the desired position of the patient. The system visualizes the actual and the desired positions and returns the similarity measure for the actual position.
It is up to the therapist to decide whether the actual position and its positional error are acceptable, or whether the patient should be repositioned according to the results of COSIMIR®.
3.1. DRR calculation

The overall performance of the system strongly depends on the DRR calculation. Currently, the system uses a ray-casting algorithm to calculate DRRs. It calculates the value of each DRR pixel by summing the Hounsfield values of the CT volume in equal steps along the ray from the focal point of the x-ray emitter to the position of the detector pixel. For the calculation of the Hounsfield value at a discrete point on the ray inside the CT volume, the user can choose between a closest-point scheme and linear, bilinear, or trilinear interpolation. Table 1 gives runtimes for a typical DRR calculation at different image sizes on a PC with an Intel Pentium IV 2 GHz processor and 512 MB RAM.
Table 1. Runtime of the DRR calculation algorithm.

Size       Closest point   Linear    Bilinear   Trilinear
64x64      0.50 s          0.55 s    0.64 s     0.83 s
128x128    1.97 s          2.17 s    2.55 s     3.30 s
256x256    1.61 s          8.48 s    10.14 s    13.05 s
512x512    21.95 s         31.30 s   39.89 s    51.61 s
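To make the ray-casting scheme concrete, here is a minimal Python sketch of a single-pixel DRR computation with the closest-point sampling scheme. The function names, the voxel-coordinate convention, and the fixed step count are our own assumptions for illustration, not the COSIMIR® implementation:

```python
import numpy as np

def drr_pixel(ct, origin, direction, n_steps=256, step=1.0):
    """Integrate Hounsfield values along one ray through a CT volume.

    ct        -- 3D numpy array of Hounsfield values
    origin    -- ray start (focal point of the x-ray emitter), voxel coords
    direction -- vector from the focal point toward the detector pixel
    Uses the closest-point (nearest-neighbour) sampling scheme; the text
    also offers linear, bilinear, and trilinear interpolation as options.
    """
    total = 0.0
    pos = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)              # equal steps along the unit ray
    for _ in range(n_steps):
        pos = pos + step * d
        idx = np.round(pos).astype(int)  # closest-point scheme
        if np.all(idx >= 0) and np.all(idx < ct.shape):
            total += ct[tuple(idx)]      # accumulate HU inside the volume
    return total

def render_drr(ct, focal, pixel_positions):
    """One DRR value per detector pixel position (iterable of 3D points)."""
    return np.array([drr_pixel(ct, focal, np.asarray(p) - np.asarray(focal))
                     for p in pixel_positions])
```

Swapping the rounding step for tri-linear interpolation of the eight surrounding voxels would give the higher-quality (and, per Table 1, slower) variants.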
3.2. Similarity measures

For 2D-3D image registration, the similarity measures mutual information (MI) and normalized cross-correlation (NCC) usually provide reasonably good results. Let (u,v) be the position of a pixel at column u and row v. Let I_{DRR}(u,v) and I_X(u,v) be the pixel intensities in the DRR and in the x-ray image, and let \bar{I}_{DRR} and \bar{I}_X be the corresponding average intensities. Then the formula for the NCC is:

NCC = \frac{\sum_{(u,v)} \left( I_{DRR}(u,v) - \bar{I}_{DRR} \right) \left( I_X(u,v) - \bar{I}_X \right)}{\sqrt{\sum_{(u,v)} \left( I_{DRR}(u,v) - \bar{I}_{DRR} \right)^2 \, \sum_{(u,v)} \left( I_X(u,v) - \bar{I}_X \right)^2}}
Mutual information (MI) is a measure of the quantity of information that one entity supplies about another entity and vice versa. The discrete case uses histograms and classifies the pixel intensities into a number of bins. The 1D histograms of the DRR and of the x-ray image yield the probability distributions p(x) and p(y), and a 2D histogram yields the joint probability distribution p(x,y) of the two images. In the experiments, the calculation uses 64 bins for the 1D histograms and 64 x 64 bins for the 2D histogram. The formula for the mutual information is then:

MI = \sum_{x} \sum_{y} p(x,y) \, \log \frac{p(x,y)}{p(x) \, p(y)}
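Both similarity measures can be written down directly from their definitions; this NumPy version is a hypothetical illustration (not the COSIMIR® code) using 64 histogram bins as in the experiments:

```python
import numpy as np

def ncc(drr, xray):
    """Normalized cross-correlation between a DRR and an x-ray image."""
    a = drr - drr.mean()
    b = xray - xray.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def mutual_information(drr, xray, bins=64):
    """Mutual information from 1D and 2D intensity histograms."""
    joint, _, _ = np.histogram2d(drr.ravel(), xray.ravel(), bins=bins)
    pxy = joint / joint.sum()            # joint probability p(x, y)
    px = pxy.sum(axis=1)                 # marginal p(x) of the DRR
    py = pxy.sum(axis=0)                 # marginal p(y) of the x-ray
    nz = pxy > 0                         # skip empty bins (log 0 undefined)
    ratio = pxy[nz] / (px[:, None] * py[None, :])[nz]
    return float((pxy[nz] * np.log(ratio)).sum())
```

An identical image pair gives NCC = 1 and the maximum attainable MI; the optimizer only needs the relative ordering of these values, not their absolute scale.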
3.3. Optimization in preferred directions

The optimization uses at least two x-rays from orthogonal viewing directions. It starts by optimizing the position of the CT volume for one viewing direction. When the improvements of the similarity measure fall below a predefined threshold, the process starts to optimize from the next viewing direction, using the first convergence point as the start position for the next optimization. After the optimization has finished for all viewing directions, the system reduces the threshold and starts the next optimization cycle.
The line optimization optimizes the similarity measure along the 6 coordinate axes, one at a time. But the sensitivity of the DRR to rotations and translations depends on the direction of the transformation. A movement of the CT volume perpendicular to the rays from the x-ray focus to the x-ray screen has a higher impact on the DRR than a movement parallel to the rays; the sensitivity of perpendicular versus parallel translations can easily differ by an order of magnitude. The same holds for rotations: a rotation about an axis parallel to the rays has a high impact, while the other two rotations have only a small impact on the DRR. We call the directions with a high impact on the DRR preferred directions and propose an optimization that mainly uses preferred directions. For each x-ray source, the optimization scheme places the z-axis parallel to the x-ray from the focus of the emitter to the center of the x-ray screen. Translations along the x- and y-axes and the rotation about the z-axis are then preferred directions. Two orthogonal x-ray sources provide only two preferred rotational directions; thus, the system needs three orthogonal x-ray sources for a pure optimization in preferred directions. If it has only two x-ray sources it uses all three rotational directions and has to cope with the low sensitivity of one of the rotational axes. Section 4 gives experimental simulation results for both configurations.
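The cyclic line optimization can be sketched as a coordinate-wise search over a chosen set of axes (e.g. the preferred directions of each view). This fragment is a simplified stand-in for the paper's optimizer; the function name, step schedule, and shrinking rule are our own assumptions:

```python
def cyclic_line_search(cost, pose, axes, step=1.0, shrink=0.5, cycles=4):
    """Maximize `cost` by probing one pose parameter at a time.

    pose -- list of 6 parameters (tx, ty, tz, rx, ry, rz)
    axes -- indices of the parameters to optimize (the preferred directions)
    After each full cycle over the axes the step is shrunk, mirroring the
    threshold reduction between optimization cycles described in the text.
    """
    pose = list(pose)
    best = cost(pose)
    for _ in range(cycles):
        for ax in axes:
            for delta in (step, -step):
                trial = list(pose)
                trial[ax] += delta
                c = cost(trial)
                if c > best:             # keep moves that raise similarity
                    best, pose = c, trial
        step *= shrink                   # refine the search in later cycles
    return pose, best
```

In the real system the `cost` callback would render DRRs at the trial pose and return NCC or MI against the x-ray images; here any smooth objective works.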
4. Phantom study
A first evaluation of the capabilities of the system is carried out with a phantom study. The phantom is a human skull. The CT scan of the skull phantom has a pixel size of 0.98 x 0.98 mm and a slice thickness of 2 mm. COSIMIR® generates a 3D surface model from the CT scan and positions the skull at a start position on the treatment couch inside its model of the treatment room. The first part of the phantom study uses synthetic images to show the validity of the optimization process; the second part uses real orthogonal x-ray pairs.
4.1. Synthetic images

The synthetic image study ignores any discrepancies between calculated DRRs and real x-ray images. Instead, it registers two or three DRRs taken at a predefined target position of the CT volume to the DRRs generated during the optimization process. The aim is to compare the known transformation with the calculated transformation. At the start position, the isocenter lies exactly in the middle of the CT volume.

Three x-ray sources

The left side of Figure 1 shows the model of the treatment room with a LINAC, three x-ray sources, and the patient. The head of the patient model includes the CT data
of the skull phantom. After translating the CT volume by 5 mm / 10 mm in all 3 directions and rotating it by 5° / 10° about all 3 axes of a coordinate system located at the isocenter, the system generates a set of three orthogonal DRRs. Beginning at the start position, the optimization tries to recover the target position of the CT volume. It uses line optimization purely in the three preferred directions. Table 2 shows the differences between the known and the calculated target pose using NCC and MI as similarity measures after 4 optimization cycles.

Table 2. Registration errors with three x-ray sources after 4 optimization cycles.

                   |ΔX|     |ΔY|     |ΔZ|     |ΔRoll|  |ΔPitch|  |ΔYaw|
NCC, 5 mm, 5°      0.1 mm   0.3 mm   0.4 mm   0.1°     0.3°      0.3°
MI, 5 mm, 5°       0.0 mm   0.0 mm   0.0 mm   0.0°     0.0°      0.0°
NCC, 10 mm, 10°    0.2 mm   2.3 mm   0.7 mm   0.1°     1.7°      0.2°
MI, 10 mm, 10°     0.3 mm   0.5 mm   0.1 mm   0.1°     0.4°      0.1°
The results demonstrate that a configuration with 3 x-ray sources can in principle solve for all 6 DOFs. For a transformation of 5 mm and 5°, the optimization converges quickly toward the target position for both similarity measures. For a transformation of 10 mm and 10°, the convergence is still very fast for MI, but not as good for NCC: after 4 optimization cycles, there is still a significant offset in the Y direction and in the pitch orientation. The experiment indicates that reasonable convergence for the selected configuration and optimization scheme takes more than 4 optimization cycles.
Figure 1. Configurations with 3 x-ray emitters (left) and 2 x-ray emitters (right).
Two x-ray sources

The right side of Figure 1 shows the same setup with only two x-ray emitters and receivers, which generate a pair of orthogonal DRRs. As it is in this case not possible
to recover all 6 DOFs by optimizing only in the preferred directions, the algorithm uses preferred directions only for the translations. For the rotations, it uses all 3 available rotational DOFs. Table 3 gives an overview of the results using NCC and MI as similarity measures and 4 optimization cycles. Contrary to our expectation, the results did not show a higher inaccuracy of the orientation compared to the experiment with three x-ray sources. For NCC, the algorithm converges faster than in the former case; for MI, it converges more slowly.
Table 3. Registration errors with two x-ray sources after 4 optimization cycles.

                   |ΔX|     |ΔY|     |ΔZ|     |ΔRoll|  |ΔPitch|  |ΔYaw|
NCC, 5 mm, 5°      0.1 mm   0.0 mm   0.1 mm   0.0°     0.0°      0.0°
MI, 5 mm, 5°       0.0 mm   0.1 mm   0.1 mm   0.0°     0.1°      0.0°
NCC, 10 mm, 10°    0.0 mm   0.0 mm   0.1 mm   0.0°     0.0°      0.0°
MI, 10 mm, 10°     0.2 mm   1.5 mm   0.1 mm   0.0°     0.8°      0.0°
4.2. Real x-ray images

In the second part of the study, COSIMIR® uses orthogonal pairs of real x-rays. Figure 2 shows, from right to left, a real x-ray, the corresponding DRR, and an inverted difference image of x-ray and DRR. The x-ray contains several types of artifacts such as lines, letters, and numbers. The optimization uses the x-rays without compensating for these artifacts. The resulting DRRs have a close visual similarity to the real x-rays, as documented by the difference image.
Figure 2. X-ray, DRR and inverted difference image for one x-ray source.
5. Discussion and conclusion

The discussed system approach offers a new way to determine the exact position of tumors in order to improve and support the development of new treatment strategies in radiation therapy. The central building block of the implemented system is an enhancement of the 3D robot simulation system COSIMIR® that enables it to combine robot-specific simulation features with new functionality for registering x-rays to a patient's CT scan. As the
system provides visual feedback of the final registration to the therapist, he can make the final decision based on a full 3D view of the patient and the treatment machinery. The registration results based on synthetic images suggest that two x-ray images contain sufficient information to solve the positioning problem for all 6 DOFs. First results based on real x-rays show a good visual match between the DRRs at the calculated registration position and the given x-ray images. Compared to other approaches, our system is independent of the patient's region of interest, uses more than one x-ray image to improve the registration in all six degrees of freedom, and is integrated into a simulation system. Currently, the system is undergoing final software tests. Meanwhile, our cooperation partners at the Clinic and Polyclinic for Radiotherapy at the University of Essen, Germany, are setting up the required hardware to test the approach in clinical practice.

Acknowledgments

We thank Prof. Dr. med. M. Stuschke and the members of the Clinic and Polyclinic for Radiotherapy at the University of Essen for the CT sequence and the corresponding x-ray images. We thank Varian Medical Systems Deutschland GmbH for the CAD data of the LINAC.

References
1. M. Stuschke and H. D. Thames, Fractionation sensitivities and dose-control relations of head and neck carcinomas: analysis of the randomized hyperfractionation trials, Radiotherapy and Oncology 51, 113-121 (1999).
2. E. Freund and J. Rossmann, Projective virtual reality: Bridging the gap between virtual reality and robotics, IEEE Trans. Robotics and Automation; Special Section on Virtual Reality in Robotics and Automation, Vol. 15, No. 3, 411-422 (1999).
3. J. Maintz and M. Viergever, A survey of medical image registration, Medical Image Analysis, Vol. 2, No. 1, 1-36 (1998).
4. R. Bansal, Information Theoretic Integrated Segmentation and Registration of Dual 2D Portal Images and 3D CT Images, Ph.D. thesis (Yale University, 2000).
5. G. P. Penney, Registration of Tomographic Images to X-ray Projections for Use in Image Guided Interventions, Ph.D. thesis (University of London, 1999).
6. P. A. Viola, Alignment by Maximization of Mutual Information, Ph.D. thesis (MIT, MA, 1995).
7. A. Schweikard, R. Z. Tombropoulos, L. E. Kavraki, J. R. Adler and J. C. Latombe, Treatment Planning for a Radiosurgical System with General Kinematics, IEEE Int. Conf. Robotics and Automation, 1720-1727 (1994).
COMPUTER-AIDED SUTURING IN LAPAROSCOPIC SURGERY
F. NAGEOTTE*, M. DE MATHELIN*, C. DOIGNON*, L. SOLER†, J. LEROY† AND J. MARESCAUX†
* LSIIT (UMR 7005 CNRS), Louis Pasteur University of Strasbourg, France
E-mail: [email protected]
† Institut de Recherche sur les Cancers de l'Appareil Digestif (IRCAD), Strasbourg, France

This work addresses the problem of the stitching task in laparoscopic surgery using a circular needle and a usual 4-DOF needle-holder. The presentation focuses on the kinematic analysis of the stitching task and the use of the information deduced from it in order to help surgeons. One of the objectives is to minimize the deformation of the tissues during the task. Based on an analysis of the kinematic and geometrical constraints of the task, we present a method to determine whether acceptable needle trajectories exist and to plan such trajectories for either manual or robot-assisted stitching.
1. Introduction
Suturing is probably one of the most difficult tasks for surgeons in laparoscopic surgery. It can be decomposed into two stages: stitching, i.e., the motion that makes the needle go through tissues, and knot tying. Most stitching tasks in surgery can be described as follows. The surgeon first selects an entry point and an exit point on the surface of the tissues, on each side of the lesion to be sutured. The surgeon drives the tip of the needle to the entry point, pierces the tissue, drives the tip of the needle to the exit point, and then pierces the tissue a second time. The needle can then be released, caught again close to the tip, and pulled out of the tissue. Reaching the exit point may be a difficult sub-task because it must be performed without direct visual feedback. In laparoscopic surgery the task becomes even more difficult because of the kinematic and visual constraints induced by the laparoscopic setup. Moreover, the monocular vision provided by the endoscopic camera (see Figure 1) makes the estimation of distances and angles very difficult. Consequently the surgeon cannot simply figure out which motions of the needle are feasible and which ones are not. When suturing in
laparoscopic surgery the surgeon must often proceed by trial and error, and in many cases large and undesirable movements and deformations of tissues cannot be avoided. Even with new robotic systems that provide instruments with supplementary degrees of freedom, the needle driving task is difficult. Most previous work on computer-aided suturing has focused on knot tying. Kang et al.2 have studied the forces applied during knot tying in training conditions. LeDuc et al.4 have developed a laparoscopic training environment, but they did not focus on the stitching stage. The work presented here is a first step toward a computer-aided suturing system. Our objective is to provide useful real-time information to the surgeons in order to help them before and during stitching. The remainder of the paper is divided into four sections. In the first section we present a geometrical and kinematic model of the stitching task with a usual 4-DOF instrument. Then, in the second section, we show the information that can be deduced from the previous analysis. The third section presents a path planning method for the needle that allows stitching with minimal tissue displacement. Discussion of current and future work can be found at the end of the paper.

Figure 1. Endoscopic view of a circular needle and a needle-holder in vivo.
2. Analysis of the stitching task with a usual needle-holder and a circular needle

2.1. Description of the laparoscopic setup
In laparoscopic surgery, small incisions are made in the abdomen of the patient. Trocars are then introduced into these incisions to allow surgical instruments and endoscopes to pass through (see Figure 2). Because of these trocars, the mobility of the instruments is reduced. Our paper deals with conventional needle-holders. They are equipped with a claw rigidly linked to the cylindrical body of the instrument. This instrument has only 4 DOFs: two rotations about the trocar center Q, one rotation of the instrument around its axis, and a translation along its axis. The geometrical study of the stitching task has been made under simple and usual conditions. Movements of the tissues to be sutured due to respiration or heart beat are not considered. The tissues to be sutured
Figure 2. Left: outline of the laparoscopic setup. Right: stitching task with desired entry point (I*) and exit point (O*).
are thin and can be locally approximated by a planar surface around the suturing area, which will hereafter be called the stitching plane. Tissues are allowed to deform under the action of the needle and the needle-holder. However, deformations normal to the stitching plane are not considered and we are only interested in movements of the tissue within the stitching plane. During stitching, the needle is rigidly held by the needle-holder. Moreover, the needle used for this study is circular and no obstacles are present in the workspace.

2.2. Modeling of the stitching task
The stitching task is defined as the movement of the needle that makes it go through the tissue to be sutured from an entry point I* to an exit point O* on the other side of the lesion. This task can be modeled as a path planning problem between two reference points given by the surgeon. These two reference points could be selected on the endoscopic image using a laser pointing instrument like the one developed in Strasbourg (cf. Krupa et al.3). Hereafter, we will call the initial position the position of the needle when its tip is in I*, and the final position the position of the needle when its tip is in O*. The ideal movement would reach O* without deforming or tearing the tissues. It is also desirable that the initial and final positions of the needle allow the tissue to be pierced, i.e., that the orientation of the tip of the needle is close to the normal to the tissue. In the remaining part of the paper, we are interested in the conditions that minimize tissue deformation in the stitching plane while reaching O* from I*. When the needle is handled by the needle-holder, any point N of the
needle can reach any position M of the abdomen of the patient under different orientations. Indeed, the possible movements when N is on M are an axial rotation around QM, where Q is the center of the trocar. In the following we will call the point M, as defined previously, a fixation point for N (see Figure 3).
Figure 3. Possible positions of the handling point H and the tip of the needle T when the point N of the needle is in I*: I* is a fixation point for N. Right: the circles of possible positions for T and H seen from Q.
The pose of the needle in the needle-holder can be defined using four parameters (see Figure 4): the angle β, that is to say the angular position of the claw on the needle; the angles ξ and ψ; and b, the position of the needle in the claw. The last parameter does not actually affect the stitching trajectories and will not be considered hereafter.
Figure 4. Parameters for the needle handling
3. Problems that can be solved with the stitching model
Using this model of the stitching task, several problems can be solved simply.
A/ Finding initial and final positions: Given an entry point, a desired distance d_IO from I* to O*, and needle handling parameters (β, ξ, ψ), we can determine the possible exit points O* for the tip of the needle. Of course, if the desired distance is greater than the diameter of the needle, no exit point is possible without deforming the tissue. When d_IO is smaller than the diameter, we can compute the point on the needle N_α that is in I* when the tip is in O*. This point is defined by the angular position α:

α = 2 \arcsin\left( \frac{d_{IO}}{2 r_n} \right)

where r_n is the radius of the needle. Then I* is a fixation point for the point N_α. Consequently, the tip of the needle, T, can move along a circle centered on the axis QI*. Depending on the position and orientation of this circle with respect to the stitching plane, there exist two solutions or none for the exit point. However, these solutions are not necessarily physically possible. We must also check that in the corresponding position the needle goes under the surface of the tissue, i.e. that the handling point H is above this surface. In a similar manner, we can also compute the possible entry points given a specified exit point. Here, O* is a fixation point for the tip of the needle and we can deduce the possible final positions. We must compute the intersection between the needle in the possible final positions and the stitching plane. Depending on these positions there can be one or no intersection (see Figure 5).

Figure 5. Curve of the possible entry points on the stitching plane with given O* and I*. I' is the closest point of the curve to I*.

This information can be used by the surgeon to check whether the exit reference point can reasonably be reached from the entry reference point. If the curve of possible entry points passes close to the desired point I*, the situation will probably be acceptable. Otherwise it will be necessary to modify the
handling of the needle. The distance from I* to the closest point of the curve, I', is an estimate of the minimum required deformation of the tissue.
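The chord geometry behind this computation can be sketched in a few lines: for a circular needle of radius r_n, the angular position of the point N_α follows from the standard chord-angle relation, and no exit point exists when d_IO exceeds the needle diameter. The function name is hypothetical:

```python
import math

def needle_chord_angle(d_io, r_n):
    """Angular position alpha of the needle point N_alpha that sits in I*
    when the tip is in O*.  A chord of length d_io on a circle of radius
    r_n subtends alpha = 2 * arcsin(d_io / (2 * r_n)).  Returns None when
    d_io exceeds the needle diameter (no exit point without deformation).
    """
    if d_io > 2.0 * r_n:
        return None                       # desired span wider than the needle
    return 2.0 * math.asin(d_io / (2.0 * r_n))
```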
B/ Computing piercing angles: In the same manner, it is possible to compute all the possible initial positions of the needle for a given entry point I*. Then I* is a fixation point for the tip of the needle. For these possible initial positions, the piercing angle can be computed simply as

piercing angle = \arccos( \mathbf{n} \cdot \mathbf{t} )

where n is the normal vector to the stitching plane at the entry point and t is the tangent at the tip of the needle. In this manner, it is possible to optimize the piercing angle for given needle handling parameters and trocar position.
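A direct transcription of the piercing-angle formula, with a clamp added for numerical safety (the clamp is our addition, not mentioned in the text):

```python
import math

def piercing_angle(n, t):
    """Angle between the stitching-plane normal n and the needle-tip
    tangent t (both assumed to be unit vectors): arccos(n . t)."""
    dot = sum(a * b for a, b in zip(n, t))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp against round-off
```

An angle of zero means the tip meets the tissue exactly along the normal, the ideal piercing condition.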
C/ Computing needle handling parameters: For a given pose of the needle in the needle-holder, the piercing angles or the required deformation may not be acceptable. The handling of the needle then has to be modified. It is interesting to know how the needle should be handled to allow a final position without deformation. In such a final position the tip of the needle is in O* and N_α is in I*. The only possible movement for the needle is a rotation around the axis I*O*. We call γ the angle between the plane of the needle in the final position and the plane normal to the surface of the tissue going through I* and O*. If the grasping position β is fixed, for each value of γ there exists a unique couple of grasping angles {ξ, ψ}. The optimal needle handling {ξ, ψ} is then a compromise between γ close to zero and a good piercing angle at I*.
4. Path planning for the needle

Once a needle handling pose has been chosen and acceptable initial and final positions found, we must know whether there exists a feasible path between these two positions. In this section we present a method to solve the path existence problem and to find an acceptable path with respect to the deformation of the tissue. Let us suppose first that the handling of the needle allows a final position for which the needle goes through the reference entry point I*. In this case, one can hope to find a path that does not deform the tissue. We then impose that the entry point stays in I* during the whole trajectory. Under these conditions, the configuration of the system can be expressed in a configuration space of dimension two with state variables φ and ρ, where φ is the angular position of the point N of the needle that is in I*, and ρ is the angular position of the tip of the needle on its circle of possible positions (see Figure 3). When I' is different from I*, we enforce the position of the entry point along the path; for example, we can impose that I follows a straight line from I* to I'. The configuration space is discretized by sampling φ and ρ. With only two degrees of freedom, the computation of a large number of configurations has a moderate cost. With a small enough sampling, a straight line in the configuration space is an acceptable local path to connect two close configurations. The sampled φ and ρ form a 2D map. For every position in this map we compute the cost of the corresponding configuration. Since we wish to prevent the tip of the needle from being close to the surface of the tissue, the cost associated with a configuration is smaller when the tip of the needle is far from the stitching plane. In order to limit the computation time of the optimal path, the possible neighbours of a position {φ_n, ρ_i} in the map are restricted to the closest configurations for which φ = φ_{n+1}, and the configurations {φ_n, ρ_{i+1}} and {φ_n, ρ_{i-1}}. Based on this oriented graph, the optimal path is computed using the algorithm proposed by Dijkstra1. The cost associated with a path in the graph is computed as the sum of the configuration costs and the distance costs between two successive configurations. Figure 7 shows the results of a simulation. The corresponding temporal evolution of the state of the needle-holder is presented in Figure 6. The resulting motion of the path generation is satisfactory, although the problem was not simple due to the torsion between the initial and the final position. This path can be used manually as well as in robot-assisted suturing (automatic mode).
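The graph search can be sketched as Dijkstra's algorithm on the sampled (φ, ρ) map. This Python fragment is an illustrative interpretation of the neighbourhood rule described above, not the authors' code; in particular, the "closest configurations" at φ_{n+1} are taken here to be the three ρ-adjacent ones:

```python
import heapq

def plan_path(cost, n_phi, n_rho, start):
    """Minimum-cost path through a discretized (phi, rho) configuration map.

    cost[n][i] -- configuration cost at phi index n, rho index i (lower
                  when the needle tip is far from the stitching plane)
    start      -- (phi index, rho index) of the initial configuration
    Edge cost = destination configuration cost + |delta rho| distance cost.
    Returns (total cost, rho index reached at the last phi sample), or
    None if the last column is unreachable.
    """
    dist = {start: cost[start[0]][start[1]]}
    heap = [(dist[start], start)]
    while heap:
        d, (n, i) = heapq.heappop(heap)
        if d > dist.get((n, i), float("inf")):
            continue                         # stale heap entry
        if n == n_phi - 1:                   # reached the final position
            return d, i
        # Allowed moves: three rho-adjacent nodes in the next phi column,
        # plus rho +/- 1 within the current column.
        nbrs = [(n + 1, j) for j in (i - 1, i, i + 1)] + [(n, i - 1), (n, i + 1)]
        for nn, jj in nbrs:
            if 0 <= jj < n_rho and nn < n_phi:
                nd = d + cost[nn][jj] + abs(jj - i)
                if nd < dist.get((nn, jj), float("inf")):
                    dist[(nn, jj)] = nd
                    heapq.heappush(heap, (nd, (nn, jj)))
    return None
```

With a high-cost region in the middle of the map (a configuration whose tip skims the tissue surface), the returned path routes around it, mirroring the behaviour described in the text.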
Figure 6. Temporal evolution of the state of the needle-holder. θx and θy are the angles around the trocar, θz the angle of rotation about the axis of the needle-holder, and dz the depth of the instrument in the abdomen.
Figure 7. Some computed positions of the needle along the generated path. The path was computed using sixty values for φ and one hundred values for ρ. From top to bottom and left to right are the initial position, the positions at steps 3, 9, 24, 33, 42, 51, 57 and the final position shown from two points of view. The stitching plane is the horizontal line.
5. Current and future work
Many different directions could be explored in future work. We are currently extending the theoretical analysis of the movements to needle-holders with 5 DOFs. Furthermore, the way the needle is pulled out of the tissue could also be studied. On the practical side, our work uses a device previously developed by Krupa et al.3. We also plan to obtain the required information using a stereo vision system. We are currently implementing the path planning software on an Aesop surgical arm from Computer Motion.
References
E. W. Dijkstra. A note on two problems in connexion with graphs. Numerische Mathematik, 1, 1959.
H. Kang and J. Wen. Robotic knot tying in minimally invasive surgeries. In IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 2002.
A. Krupa, J. Gangloff, M. de Mathelin, G. Morel, J. Leroy, L. Soler, and J. Marescaux. Autonomous 3-D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing. IEEE Transactions on Robotics and Automation, 19(5):842-853, October 2003.
M. LeDuc, S. Payandeh, and J. Dill. Toward modeling of a suturing task. In Graphics Interface, 2003.
EXPERIMENTAL VALIDATION OF A FORCE PREDICTION ALGORITHM FOR ROBOT-ASSISTED BONE MILLING

CHRISTOPHER PLASKOS
Laboratoire TIMC-IMAG, Groupe GMCAO, Faculté de Médecine de Grenoble, Université Joseph Fourier, Institut de l'Ingénierie et de l'Information de Santé, La Tronche, 38706, France

ANTONY J. HODGSON
Neuromotor Control Laboratory, Department of Mechanical Engineering, University of British Columbia, 2324 Main Mall, Vancouver, B.C. V6T 1Z4, Canada

PHILIPPE CINQUIN
Laboratoire TIMC-IMAG, Faculté de Médecine de Grenoble, Université Joseph Fourier, Institut de l'Ingénierie et de l'Information de Santé, La Tronche, 38706, France
The objectives of this study were 1) to measure instantaneous milling force components in bovine cortical bone as a function of cutter rotation for various tool geometries, cutting speeds, feed rates and feeds per tooth, radial and axial cutting depths, and cutting directions and modes, and 2) to evaluate a milling model that uses orthogonal cutting data to predict the instantaneous milling force components as a function of the above parameters. Experimental results demonstrate that milling force components are periodic signals with their curve shape, mean, and amplitude changing dramatically with the milling tool geometry and kinematics. Simulated and measured instantaneous milling forces are presented for the illustrative cases of full and half immersion down milling with two and four fluted cutting tools. Average slot milling forces at various feeds per tooth are also presented for two different cutting velocities. The model predictions are in good agreement with the measurements.
1. Introduction
We have been working to build a generalised model of the bone milling process in order to gain a better understanding of how instantaneous milling force components vary, both in magnitude and in proportion, as a function of the cutting conditions. Comprehension of these force relationships and their dependence on the machining parameters, such as milling speed, feed rate, radial and axial cutting depth, and cutter geometry, is essential for analysing stability and improving safety, as well as optimising cutting accuracy and time. Although bone milling is commonly performed in many orthopaedic and neurosurgery procedures, the process is often regarded as being somewhat unstable and difficult to control. Forces can increase suddenly and
unexpectedly, causing the tool to slip from the planned trajectory and compromising surgical safety and accuracy. Despite these difficulties, we could only find a small number of detailed studies in the literature that quantify milling force components and trends as a function of several relevant parameters [1, 2]. Krause et al. [1] reported on feed forces for milling bovine bone with high-speed burrs operating at various feed rates and cutting depths. Although the precise rotational velocity during cutting was not measured, they did find that average peak feed forces generally increased with cutting depth and feed rate, and decreased with cutting speed. For equivalent depths of cut, however, the effect of tool geometry on cutting force was dependent upon the rotational speed. Denis et al. [2] measured milling forces in the feed and normal directions for human proximal tibial bone as a function of milling speed and feed per tooth. Average peak forces in either direction increased with feed per tooth, though the rate of increase was different for each force component and cutting speed. Moreover, they found that the influence of speed was not constant across specimens or component directions. Clearly, a mathematical model of the cutting process would be an invaluable tool to aid in the analysis of milling. We have reported previously [3] that a non-linear model of the cutting process provides reasonable agreement with the maximum feed force values published in ref [1], though unfortunately we could not find more detailed force component measurements in the literature with which we could evaluate our model.
The objectives of this study were therefore (1) to measure instantaneous milling force components in bovine cortical bone as a function of the cutter rotation for various tool geometries, cutting speeds, feed rates and feeds per tooth, radial and axial cutting depths, and cutting directions and modes, and (2) to evaluate a milling model that uses orthogonal cutting data to predict the instantaneous milling force components as a function of the above parameters.

2. Experimental Methods
We conducted a series of bone milling experiments on a horizontal machining center (Mori Seiki SH-403, Japan) and measured force components in the feed and normal directions with a dynamometer (Kistler Instruments, type 9255B, Switzerland). The piezoelectric force signals were passed through charge amplifiers and into a PC based data acquisition system, configured to sample at a rate equivalent to 1° of tool rotation. Measurement noise was reduced by averaging the data at every individual 1° point over several revolutions. Tool 'runout' was also reduced by taking the average of the first and second half of the rotation cycle.
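The two averaging steps described above (noise reduction over revolutions, runout reduction by folding the half-cycles of a symmetric two-fluted pass) can be sketched as follows; this is a minimal illustration with our own function name, not the authors' acquisition code:

```python
def average_force_signal(revolutions):
    """Average per-degree force samples over several revolutions (noise
    reduction), then average the first and second half of the rotation
    cycle (runout reduction for a symmetric two-fluted tool).

    revolutions: list of 360-sample force traces, one per tool revolution.
    Returns a 180-sample averaged half-cycle.
    """
    n = len(revolutions)
    # noise reduction: average the reading at each 1-degree position
    per_degree = [sum(rev[i] for rev in revolutions) / n for i in range(360)]
    # runout reduction: fold the two half-cycles onto each other
    return [0.5 * (per_degree[i] + per_degree[i + 180]) for i in range(180)]
```

Folding the half-cycles cancels any once-per-revolution offset (such as tool runout) that adds to one tooth pass and subtracts from the other.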
Cortical bone specimens were dissected from the mid-diaphysis of fresh bovine femurs with a diamond blade band saw under a constant supply of irrigation, and then milled into slices of constant thickness (~5-10 mm) on a milling machine. For the milling tests, the bone specimens were clamped so that the milling tool end projected from the far end of the slice, thereby avoiding tool tip end effects (i.e. face milling) and allowing the generated bone chips to escape freely from the cutting zone and not influence the measured cutting forces. The bone specimens were kept wet with physiologic saline solution throughout the experiments, though no irrigation was directly applied to the cutting zone during the force measurement phase. The ranges of experimental conditions tested are summarized in Table 1. The feed per tooth, c, is the linear distance the cutter travels between each successive tooth pass (i.e. c = f / (w·n) [mm], where f is the feed rate [mm/min], w is the angular velocity [RPM] and n is the number of cutting teeth). Milling was performed with the cutting tool fully and partially (half) immersed into the specimen to test the influence of radial cutting depth. To evaluate the influence of bone anisotropy on cutting forces, milling was carried out with the feed direction both parallel to and transverse to the long axis of the bone. Four different cutting tool geometries were used in total (Table 2).

Table 1. Range of experimental conditions tested.
Feed per tooth, c [mm]: 0.0005-0.350
Cutting velocity, w [RPM]: 40-5,000
Radial depth of cut, r [mm]: 2·R, R (full and half immersion)
Axial depth of cut, a [mm]: ~5-10
Milling mode: Up and down milling
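The feed-per-tooth relation c = f / (w·n) can be computed directly; a small sketch (function name is ours) using values from the experimental ranges above:

```python
def feed_per_tooth(feed_rate, rpm, n_teeth):
    """c = f / (w * n): linear advance of the cutter per tooth pass [mm],
    for feed rate f [mm/min], spindle speed w [RPM], n_teeth cutting teeth."""
    return feed_rate / (rpm * n_teeth)

# The same 10 mm/min feed rate gives a 50x larger chip load at 100 RPM
# than at 5000 RPM with a two-fluted tool.
c_slow = feed_per_tooth(10.0, 100, 2)    # 0.05 mm = 50 um per tooth
c_fast = feed_per_tooth(10.0, 5000, 2)   # 0.001 mm = 1 um per tooth
```

This 50-fold difference in chip thickness at equal feed rates is the effect underlying the speed comparison discussed later for Figure 3.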
Table 2. Tool geometries used, as specified by the manufacturer. Columns include rake angle a [°], helix angle h [°], diameter [mm], number of flutes n, and material (carbide or HSS; HSS = High Speed Steel); cutter diameters of 3.175 and 9.525 mm were used, and Tool 'B' is a two-fluted HSS cutter with 10° rake and 30° helix. [Remaining table entries are not legible in this copy.]

3. Milling Model Formulation
The basis of our modelling approach has been described previously [3]. In short, the model considers the instantaneous kinematics of the milling tool and the specific cutting energy or pressure of cortical bone as determined from orthogonal cutting tests. The specific cutting energy is a measure of the energy required to remove a unit volume of bone by cutting, and it is dependent on several variables, including the geometry of the cutting edge, the thickness of the chip produced, and the orientation of the bone specimen [4]. Each cutting edge of the milling tool is partitioned axially into a series of small elements, and for any cutter orientation within a tool revolution cycle (i.e. θ = 0, 1, 2, ..., 360°) the tooth elements that are actively engaged in cutting are identified. For a particular set of cutting conditions (f, w, a, r, n) the instantaneous thickness of the cutting chip removed by each edge element is determined. When a tooth is engaged in cutting, the chip thickness is simply:
t(θ) = c sin(θ)    (1)
where c is the feed per tooth [mm] and θ is the cutting edge orientation measured from the y axis (i.e. the line perpendicular to the feed velocity). The corresponding tangential and radial force components acting on the tooth element can then be approximated by:
F = K tⁿ Δa    (2)
where K and n are calculated from the orthogonal force data and are dependent on the tool geometry and cutting direction, and Δa is the axial width of the element. These elemental force components are then transformed into the fixed milling coordinate system (x, y) and summed along each cutting tooth to obtain the instantaneous milling force components in x and y.
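Equations (1) and (2) together with the summation over engaged teeth can be sketched as follows for slot milling with straight (zero-helix) teeth treated as a single axial element. The radial-to-tangential force ratio and the sign conventions of the (x, y) transform are our assumptions for illustration, not values from the paper:

```python
import math

def milling_forces(c, K, n_exp, ratio_rt, a_depth, n_teeth):
    """Sketch of the chip-load model for slot milling.
    c        feed per tooth [mm]
    K, n_exp coefficients fitted to orthogonal cutting data, eq. (2)
    ratio_rt assumed radial/tangential force ratio (illustrative)
    a_depth  axial depth of cut [mm], used as a single element width
    Returns lists Fx, Fy over one revolution in 1-degree steps.
    """
    Fx, Fy = [], []
    for deg in range(360):
        fx = fy = 0.0
        for k in range(n_teeth):
            theta = math.radians((deg + k * 360.0 / n_teeth) % 360)
            if math.sin(theta) > 0.0:            # tooth engaged (slot: 0-180 deg)
                t = c * math.sin(theta)          # eq. (1): chip thickness
                ft = K * t**n_exp * a_depth      # eq. (2): tangential component
                fr = ratio_rt * ft               # assumed radial component
                # transform tooth forces into the fixed (x, y) frame
                fx += -ft * math.cos(theta) - fr * math.sin(theta)
                fy += ft * math.sin(theta) - fr * math.cos(theta)
        Fx.append(fx)
        Fy.append(fy)
    return Fx, Fy
```

Extending this to helical teeth amounts to stacking several axial elements per tooth, each offset in θ by the local helix lag, and summing their contributions as the text describes.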
4. Results
The measured milling force components were periodic signals with the curve shape, mean, and amplitude changing dramatically with the milling tool geometry and kinematics. Figure 1 illustrates how the instantaneous milling force components in the feed (x) and normal (y) directions vary as a function of the tool rotation angle (spindle rotation, θ) while slot milling with a double fluted helical cutting tool at three different feeds per tooth (c = 2, 50, 100 μm; the curves with the higher force peaks represent the higher feed data). The feed direction is parallel (left plot) and transverse (right plot) to the axis of the bone. Note that although the feed per tooth is doubled from c = 50 to 100 μm, the peak forces in both x and y are not doubled correspondingly, which is in agreement with the simulations. It can also be observed that the peak forces in the normal (y) direction are slightly higher when milling parallel to the bone axis. This is a result of the increase in specific cutting energy when the edge
elements are cutting transverse to the predominant fibre direction (i.e. at θ = 90° when milling parallel to the fibres). Figure 2 demonstrates how the instantaneous milling force components in the feed (x) and normal (y) directions vary during half-immersion down-milling, for a cutting tool having two helical teeth (upper plot series) and one having four helical teeth (lower plot series). With the two fluted cutter, there are large variations in each cutting force component with time, while this variation is relatively small for the four toothed tool. This is a result of the different cut geometry, since at various stages in the rotation cycle there are a different number of teeth engaged in the cutting zone for each case.
Figure 1. Experimental and simulated instantaneous milling force components in x and y for slot (full-immersion) milling with a cutter having two helical teeth (Tool 'B'). Feed direction is parallel (left plot) and transverse (right plot) to the bone axis. Cutting conditions: feeds per tooth c = 2, 50, 100 μm; rotational velocity w = 100 RPM; axial depth of cut a = 9.1 mm. Orthogonal cutting data from [4].
For both half immersion cases, it is observed that the feed force (Fx) shifts from negative to positive with increasing feed per tooth. This is a result of the proportional difference between the tangential and the radial force components acting on each tooth element at different chip thicknesses. Comparing full and half immersion milling with the two fluted tool, it is apparent that the feed force direction also depends on the radial cutting depth. In surgery, these forces would tend to pull or push the cutting tool in opposite directions along the feed path, depending on the radial cutting depth and feed rate applied.
Figure 2. Experimental and simulated milling forces in x and y for half-immersion down-milling with two fluted (upper plot series; Tool 'A') and four fluted (lower series; Tool 'C') helical tools. Cutting conditions: feed rates f = 4, 52, 100, 200 mm/min; rotational velocity w = 1000 RPM; axial depth of cut a = 9.6 mm for Tool 'A' and 8.8 mm for Tool 'C'. Orthogonal data from reference [5].
Figure 3 illustrates how the average magnitudes of the measured and simulated milling force components in the x and y direction increase with feed rate for slot milling at 100 RPM and 5000 RPM. For equivalent feed rates (i.e. left column), the 100 RPM force data are generally ~5 times greater than the 5000 RPM data in the x direction, and ~20 times greater in the y direction. The difference in the force magnitude between the two speeds is largely due to the difference in the chip thickness (i.e. the feed per tooth, c), whilst the difference in the force component proportions between x and y at each speed is due to the different rates of change between the tangential and radial force components as a function of the chip thickness. At equivalent feeds per tooth (right column), the measured force magnitudes at each cutting speed are comparable, though the rate of increase in force with c is ~3 times higher in the x direction for the 5000 RPM data. This difference is caused by the effect of cutting velocity on the tangential and radial force components.
Figure 3. Measured (Exp.) and simulated (Sim.) mean cutting force magnitudes in the x (feed) and y directions for slot milling with cutter 'A' at 100 RPM and 5000 RPM, for equivalent feed rates (left column), and equivalent feeds per tooth (right column). The axial depth of cut was a = 9.6 mm. Orthogonal data from reference [4].
5. Discussion

Cortical bone milling forces are dependent on several parameters, including tool geometry, cutting speed, feed rate, radial and axial cutting depth, and cutting direction and mode. To our knowledge, this is the first study to present cutting force patterns for bone milling as a function of the tool rotation. These data are useful for gaining a better understanding of, and for validating models of, the physical mechanisms governing the bone milling process. Although the algorithm demonstrated reasonable accuracy for milling at higher feeds per tooth, proportionally less accurate results were obtained when milling at very small feeds per tooth (i.e. for c on the order of a few microns). This is due in part to the relatively poor approximation of the radial cutting force by equation (2), particularly for tools having rake angles >0° [5]. At very small chip thickness, the complex frictional and ploughing forces acting between the bone and the tool rake and clearance faces have a proportionally greater influence on the force components. The dependence of these parasitic forces on machining parameters such as cutting velocity and chip thickness is still not clear and remains a topic of ongoing research. Nevertheless, it is apparent that the simple chip load model presented here can be useful for helping understand and predict the complex interrelationships between milling conditions and forces.
Acknowledgements

The authors wish to thank Jochem Roukema and Yusuf Altintas in the Manufacturing and Automation Laboratory at the University of British Columbia for providing invaluable assistance and the experimental equipment. This research was supported in part by NSERC Canada and by PRAXIM medivision.
References
1. W.R. Krause, D.W. Bradbury, et al. Temperature elevations in orthopaedic cutting operations. J. Biomechanics 15:267-275 (1982).
2. K. Denis, G. Van Ham, et al. Influence of bone milling parameters on the temperature rise, milling forces and surface flatness in view of robot-assisted total knee arthroplasty. Proceedings of CARS, 300-306 (2001).
3. C. Plaskos, A.J. Hodgson, B.A. Masn. A generalized model for predicting force and accuracy in bone milling with application to computer-assisted knee arthroplasty. Proceedings of CAOS (2002).
4. Wiggins KL, Malkin S. Orthogonal machining of bone. J. Biomech. Eng. 100, 122-130 (1978).
5. C. Plaskos, A.J. Hodgson, P. Cinquin. Modelling and optimization of bone-cutting forces in orthopaedic surgery. Proceedings of MICCAI, 254 (2003).
SKALPEL-ICT: SIMULATION KERNEL APPLIED TO THE PLANNING AND EVALUATION OF IMAGE-GUIDED CRYOTHERAPY*

ALEXANDRA BRANZAN ALBU
Computer Vision and Systems Lab, Laval University, Québec (QC), G1K 7P4, Canada

DENIS LAURENDEAU
Computer Vision and Systems Lab, Laval University, Québec (QC), G1K 7P4, Canada

CHRISTIAN MOISAN
Dept. of Radiology, Laval University, Québec (QC), G1K 7P4, Canada

DENIS RANCOURT
Dept. of Mechanical Engineering, Sherbrooke University, Sherbrooke (QC), J1K 2R1, Canada
The SKALPEL-ICT project (Simulation Kernel Applied to the Planning and Evaluation of Image-guided Cryotherapy) aims at developing a virtual environment for the simulation of liver cancer cryotherapy. This simulator will mainly be used to plan the configuration and the positioning of the cryo-probes for an image-guided cryo-surgical intervention. Moreover, the prototype of the simulator will constitute an excellent training tool for future surgeons. The design of a simulator of this type involves several challenges related to the multidisciplinary nature of the project, which imposes the use of a modular approach. Presently, we are developing three parallel models simultaneously in the SKALPEL-ICT project: the thermal model associated with the prediction of the temperature of an ice-ball, the geometric model associated with the 3D reconstruction of hepatic tumours, and the mechanical model which reproduces the real deformations of the liver and force feed-back during the cryo-probe insertion process. The final phase of the project is to deal with the integration of these three models into a virtual environment using an interaction-centric modeling approach.
1. Introduction

The virtual environment technologies offering support for medical applications are mainly designed for diagnosis, education, training, and rehabilitation. Whereas surgical training traditionally involves one-on-one experiences following a master-apprentice model, virtual environments dedicated to surgery simulation allow the trainees to learn new procedures, practice skills several

* This work is supported by strategic grant no. 234773 of the Natural Sciences and Engineering Research Council of Canada.
times, and eventually assess their level of competence before they operate on real patients. Moreover, the rather recent development of minimally invasive interventional techniques (endoscopy [1], eye surgery, cryotherapy etc.) triggered an increased interest in the area of computer-based medical simulators. This paper describes the development of a virtual environment for the simulation of liver cancer cryotherapy. This simulator will be used to plan the configuration and the positioning of the cryo-probes for an image-guided cryo-surgical intervention, and to visualize the thermal information corresponding to the intra-operative freezing process as well. Moreover, the prototype of the simulator will constitute an excellent training tool for future surgeons. The design of a simulator of this type involves several challenges related to the multidisciplinary nature of the project, which imposes the use of a modular approach. The following three sections of this paper describe the geometric model associated with the segmentation and 3D reconstruction of hepatic tumours, the mechanical model which reproduces the real deformations of the liver and force feed-back during the cryo-probe insertion process, and the thermal model associated with the prediction of the temperature inside the growing ice-ball. Section 5 gives details about the final integration of these three models into a dynamic virtual environment. The last section of this paper outlines the conclusions of our research.

2. Geometric modeling
2.1. Segmentation

The database for the geometric modeling consists of axial, sagittal and coronal sequences of 2D magnetic resonance images corresponding to parallel and equidistant anatomical slices. In every acquisition plane, a large difference is noted between the horizontal intra-slice resolution (ca. 1.56 mm) and the vertical inter-slice resolution (ca. 10 mm). This difference cannot be minimized during the acquisition process, due to the respiratory motion artefact and to technical limitations. Therefore, 3D segmentation approaches are not reliable in our case and we must consider 2D segmentation techniques. We concentrate on two particular features of abdominal MRI data. The first one is the inhomogeneous texture of the liver tumours at a certain stage of their evolution (see Figure 1). For instance, large-sized liver tumours develop a lobular appearance due to the aggregation of several small-sized lesions surrounded by edema. The second feature is the non-uniform sharpness of the tumour boundary, which may contain sharp segments alternating with "blurred" segments (see Figure 1). Due to infiltration into surrounding tissues, malignant liver lesions often present contours that do not reveal a clear-cut transition. To detect the region of interest, we create an isolabel tumour contour map using a multi-threshold technique and a similarity measure for the contours
(see Figure 1b). In order to extract the isolabel contour map of the tumour, we need a minimum amount of information about its location in the liver. The radiologist is asked to select one single reference pixel located inside the tumour. Tumours with "blurred" contours of variable sharpness are detected with a pixel aggregation algorithm (see Figure 1c).
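The seed-pixel and multi-threshold steps can be sketched as follows; the paper does not specify its similarity measure, so region overlap (Jaccard index) stands in for it here, and all names are ours:

```python
from collections import deque

def region_at_threshold(img, seed, thr):
    """4-connected component of pixels >= thr containing the seed pixel."""
    h, w = len(img), len(img[0])
    if img[seed[0]][seed[1]] < thr:
        return set()
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w \
                    and (nr, nc) not in region and img[nr][nc] >= thr:
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

def isolabel_map(img, seed, thresholds):
    """Seed-containing regions at each threshold level, plus an
    overlap-based similarity between consecutive levels (a stand-in for
    the paper's unspecified similarity measure)."""
    regions = [region_at_threshold(img, seed, t) for t in thresholds]
    sims = [len(a & b) / max(len(a | b), 1)
            for a, b in zip(regions, regions[1:])]
    return regions, sims
```

Threshold levels whose regions stay highly similar to their neighbours would correspond to stable isolabel contours of the tumour.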
Figure 1. a) Axial image containing several hepatic lesions. The arrow points at a tumour with nonuniform texture and "blurred" boundaries; b) Extraction of the isolabel contour map corresponding to a tumour containing the red starting pixel; c) Evolution of the pixel aggregation algorithm. The green area represents the tumour region which has not been detected.
2.2. Reconstruction

Segmentation results are input data for a custom 3D reconstruction algorithm [3], which combines shape-based interpolation and contour-based extrapolation (see Fig. 2a and 2b). While interpolation generates intermediate slices between pairs of adjacent input slices, extrapolation performs a smooth closing of the external surface of the model. Surface rendering consists of the generation of a triangular mesh using a parametric representation of 2-D intermediate contours.
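Shape-based interpolation of intermediate slices is commonly implemented by blending signed distance maps of adjacent binary slices and thresholding at zero; the brute-force sketch below illustrates that idea under our own simplifying assumptions (both masks non-empty and non-full), not the custom algorithm of [3]:

```python
def signed_distance(mask):
    """Brute-force signed distance map of a binary slice: positive inside,
    negative outside. Assumes the mask has at least one inside and one
    outside pixel."""
    h, w = len(mask), len(mask[0])
    inside = [(r, c) for r in range(h) for c in range(w) if mask[r][c]]
    outside = [(r, c) for r in range(h) for c in range(w) if not mask[r][c]]
    def dist(p, pts):
        return min(((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 for q in pts)
    sd = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            sd[r][c] = dist((r, c), outside) if mask[r][c] else -dist((r, c), inside)
    return sd

def interpolate_slice(mask_a, mask_b, alpha=0.5):
    """Shape-based interpolation: blend the two signed distance maps and
    threshold the result at zero to obtain the intermediate slice."""
    da, db = signed_distance(mask_a), signed_distance(mask_b)
    h, w = len(mask_a), len(mask_a[0])
    return [[(1 - alpha) * da[r][c] + alpha * db[r][c] > 0 for c in range(w)]
            for r in range(h)]
```

In practice the distance maps would be computed with a fast distance transform rather than this quadratic scan, but the blend-and-threshold step is the same.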
Figure 2. a) Input configuration for the 3D reconstruction, viewed in the z direction; b) geometric model of the tumour corresponding to the input configuration in a); c) Reference mesh; d) Simulating the deformations during the cryo-probe insertion on the reconstructed tumour mesh; e) Five independent experimental measurements of forces during the compression of an ex vivo deer liver sample by a biopsy needle (light grey curves), and simulated forces on the reference mesh and on a reconstructed 3D tumour model; f) Mechanical law derived from experimental characterization.
3. Mechanical modeling

Our mechanical model is based on the finite element based tensor-mass algorithm [4], which computes forces from a combination of local stiffness tensors attached to every mesh element. These tensors depend only on the mesh geometry at rest, and on the mechanical properties of the tissue. Therefore they can be computed in a preliminary step, while computation in the actual simulation is limited to a linear combination of stiffness matrices and displacement vectors, meeting real-time constraints. As shown in [5], it is possible to extend the linear tensor-mass model for simulating different types of non-linear and visco-elastic mechanical properties. The first results were obtained using meshes consisting of a regular assembly of cubic elements (see Figure 2c). The same mechanical model can be applied on a non-uniform mesh derived from a reconstructed 3D tumour model (see Figure 2b). The mechanical tissue properties used for testing were obtained from experimental in vitro compression of deer liver membrane by a biopsy needle, since no in vivo mechanical data of liver tumour could be measured so far. Although differences can be observed between the two simulations due to the different mesh geometries, the accordance between experimental data and both simulations can be considered satisfactory. Due to the thin tumour geometry (approximately 8 mm in thickness), forces on the tumour mesh tend to increase more slowly at higher deformations, as it becomes almost entirely pierced. This comparison shows that the proposed mechanical model can be successfully applied to variable geometries. Figure 2e presents five independent experimental force measurements, as well as the simulated force on the reference mesh used to fit our model parameters, and the simulated force computed on the non-uniform tumour mesh. Compression speed was in all cases constant at 10 mm/s. The tissue model derived from these measurements was highly non-linear.
Figure 2f shows the non-linear function introduced into the tensor-mass model to account for these properties. For low deformations the Young modulus was E = 3600 Pa, and the Poisson coefficient was kept constant at v = 0.4. Figure 2d shows several deformed mesh configurations at different time steps.
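The precompute/runtime split of the tensor-mass scheme can be sketched as follows: the stiffness blocks are assembled once from the rest geometry and material parameters, and each simulation step is only a linear combination of those blocks with nodal displacements. The explicit damped integration and all names here are illustrative, not the exact formulation of [4]:

```python
def tensor_mass_step(positions, rest, K, masses, velocities, dt, damping=0.1):
    """One explicit step of a linear tensor-mass update.

    K: dict mapping (i, j) -> dim x dim stiffness block, precomputed from
    the rest geometry and material parameters (e.g. E, nu). Forces are
    f_i = -sum_j K_ij * u_j, with u_j the displacement of node j.
    """
    n, dim = len(positions), len(positions[0])
    forces = [[0.0] * dim for _ in range(n)]
    for (i, j), block in K.items():
        u = [positions[j][d] - rest[j][d] for d in range(dim)]
        for d in range(dim):
            forces[i][d] -= sum(block[d][e] * u[e] for e in range(dim))
    new_v, new_p = [], []
    for i in range(n):
        v = [(velocities[i][d] + dt * forces[i][d] / masses[i]) * (1 - damping)
             for d in range(dim)]
        new_v.append(v)
        new_p.append([positions[i][d] + dt * v[d] for d in range(dim)])
    return new_p, new_v
```

Because K never changes during the simulation, the per-step cost is linear in the number of nonzero blocks, which is what makes the real-time constraint attainable.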
4. Thermal modelling

When performing the cryo-surgical intervention, the surgeon must precisely determine the extent of cell destruction induced by the cryogenic probes. Obviously, the goal is to entirely destroy the malignant tissue while avoiding harm to adjacent healthy tissue. While temperatures between -40°C and -60°C are considered to be effective lethal temperatures for a wide range of in vitro and in vivo tissues [6], the magnetic resonance signal does not exist below 0°C.
Due to the lack of accurate thermal information during MR image-guided cryotherapy procedures, the hold time and the number of freeze/thaw cycles are in most cases estimated empirically. The goal of the thermal module is to enhance real-time MR images with quantitative data on the growth rate of the iceball volume and tissue temperature during the procedure. This additional information should provide better control of the cryo-surgical intervention, since it enables the quantification of the effect of a chosen freeze-thaw cycle. The most straightforward mapping of the thermal evolution during cryo-surgery implants a set of thermosensors at specific landmarks. While this approach provides a direct reading of local temperatures, the spatial resolution of its measurements is significantly coarse. In contrast, MRI combined with the energy equation of the freezing process can be used to compute the thermal map. We have developed a simple analytical method [7], which uses experimental measures and numerical processes to compute thermal maps with good spatial and thermal resolutions. The temperature estimation procedure contains the following four main steps. 1) Image processing. Images acquired during the freezing cycle are subtracted from a reference image and normalized afterwards. This scaling is used for emphasizing intensity variations inside the growing iceball region. 2) Intensity-to-Temperature Correlation. The relationship between the normalized intensity of the MR signal and the temperature of a given location within the iceball was directly measured by means of thermosensor readings. 3) Intensity and thermal segmentations. The volumes and diameters of the iceball are extracted from the normalized images and are afterwards represented as functions of time and intensity. The straightforward conversion from intensity to temperature allows substituting one for the other in the distributions. 4) Regressions in Time and Thermal Domains.
The volume and diameter distributions are extrapolated to lower temperatures using two simple regressions in time and temperature domains respectively. Thus, it is possible to estimate temperatures at any time and at any distance away from the probe.
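Steps 1, 2 and 4 above reduce to image normalization and simple regressions; the sketch below assumes a linear intensity-to-temperature correlation for illustration (the correlation actually measured in [7] via thermosensors may have a different form):

```python
def normalize(image, reference):
    """Step 1: subtract the reference image (pixel lists) and scale the
    absolute differences to [0, 1]."""
    diff = [abs(p - r) for p, r in zip(image, reference)]
    m = max(diff) or 1.0   # avoid division by zero on a flat difference
    return [d / m for d in diff]

def fit_linear(xs, ys):
    """Least-squares line (slope, intercept); usable both for the
    intensity-to-temperature correlation (step 2) and the extrapolating
    regressions in time and temperature (step 4)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def intensity_to_temperature(intensity, slope, intercept):
    """Step 3: convert a normalized intensity into a temperature [deg C],
    using the fitted correlation."""
    return slope * intensity + intercept
```

With the fitted regressions, temperatures below the MR-visible range can then be extrapolated at any time and distance from the probe, as the text describes.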
Figure 3. Measured and predicted temperatures at different distances from the cryoprobe axis [7].
Figure 3 shows the measured and predicted temperatures at two different distances (5 mm and 8 mm) from the cryoprobe axis in ex vivo pig liver.
5. Modular integration based on the APIA approach

The final integration phase of the project uses a task-oriented simulation framework, namely the Actor-Property-Interaction Architecture (APIA) [8]. APIA implements a new paradigm named Interaction Centric Modelling that allows a flexible design of virtual entities and their physical behaviours, such as visco-elasticity, dry friction during needle insertion, etc. Figure 4 illustrates an example using APIA in a virtual cryo-surgery context. In such a virtual world, a surgeon can make an incision with a scalpel or freeze the liver with a cryoprobe. The Cut interaction operates between the Cutable character and the Force Generator character, while the Freeze interaction occurs between a Heat Sensitive character and a Heat Generator character. These characters group some properties required by the interaction. An actor will own all the properties of inherited characters. Since the dependency between instantiated entities is generic, it is possible for the liver to inherit from a Heat Generator character and therefore take into account the fact that the liver is itself a heat generator.
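The character/actor/interaction pattern can be sketched as follows; class names follow the text, but the implementation details are our own illustration, not the APIA code of [8]:

```python
class Character:
    """A character groups the properties an interaction requires."""
    required = ()

class HeatGenerator(Character):
    required = ("temperature",)

class HeatSensitive(Character):
    required = ("temperature", "thermal_conductivity")

class Actor:
    """An actor owns all the properties of its inherited characters."""
    def __init__(self, name, characters, **props):
        self.name = name
        self.characters = characters
        missing = {p for ch in characters for p in ch.required} - props.keys()
        if missing:
            raise ValueError(f"{name} missing properties: {missing}")
        self.props = props

class Freeze:
    """The Freeze interaction couples a Heat Generator source with a
    Heat Sensitive target."""
    def applies(self, source, target):
        return (HeatGenerator in source.characters
                and HeatSensitive in target.characters)
```

Because actors simply list the characters they inherit, the liver can carry both HeatSensitive and HeatGenerator, so the Freeze interaction remains valid with the liver as a heat source, mirroring the generic dependency described above.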
Figure 4. Representation of some virtual entities and of their interaction during a virtual cryosurgery [8]
6. Conclusions

It is widely recognized that the simulation of surgery through realistic Augmented Reality (AR) environments offers a safe, flexible, and cost-effective solution to the problem of planning treatment procedures as well as of training surgeons in mastering highly complex manipulations. In this context, SKALPEL-ICT represents a unique immersive and interactive augmented reality simulation environment for the MRI-guided cryotherapy of the liver. SKALPEL-ICT addresses the simulation of two cryosurgery steps: i) the viscoelastic behavior of the tumor during the insertion of a virtual cryoprobe and ii) the freezing process when the cryoprobe is activated. These simulations imply that a 3D geometric model of the tumor is available. This model is reconstructed from segmented tumour slices in MR images. Unlike most simulation environments based on scene graphs, SKALPEL-ICT implements a physically-oriented simulator based on the concepts of actors, properties, and interactions. Actors are described by properties and interact with each other through interaction objects.
References
1. L. Hong, A. Kaufman, Y. Wei, A. Viswambharan, M. Wax, and Z. Liang, "3D Virtual Colonoscopy," IEEE Symposium on Frontier in Biomedical Visualization, IEEE CS Press, Los Alamitos, 1995, pp. 26-32.
2. A. Branzan Albu, D. Laurendeau, and C. Moisan, "Tumour detection in MR Liver Images by Integrating Edge and Region Information," ESSAIM Proceedings of Modelling and Simulation for Computer-Aided Medicine and Surgery, Rocquencourt, France, 2002.
3. A. Branzan Albu, J.-M. Schwartz, D. Laurendeau, and C. Moisan, "Integrating geometric and biomechanical models of a liver tumour for cryosurgery simulation," in Proc. IS4TM: International Symposium on Surgery Simulation and Soft Tissue Modelling (Juan-Les-Pins, France), June 12-13, 2003.
4. S. Cotin, H. Delingette, and N. Ayache, "A hybrid elastic model for real-time cutting, deformations, and force feedback for surgery training and simulation," Visual Computer, vol. 16, 2000, pp. 437-452.
5. J.-M. Schwartz, E. Langelier, D. Rancourt, C. Moisan, and D. Laurendeau, "Non-linear soft tissue deformations for the simulation of percutaneous surgeries," Proc. MICCAI 2001, Utrecht, Netherlands.
6. J. Baust, A. A. Gage, H. Ma, and C. M. Zhang, "Minimally Invasive Cryosurgery - Technological Advances," Cryobiology, vol. 34, 1997, pp. 373-384.
7. R. Fournial, A. S. Traoré, D. Laurendeau, and C. Moisan, "An analytic method to predict the thermal map of cryosurgery iceballs in MR images," IEEE Trans. on Med. Imaging, vol. 23, no. 1, 2004, pp. 122-129.
8. F. Bernier, D. Poussart, D. Laurendeau, and M. Simoneau, "Interaction-centric Modelling for Interactive Virtual Worlds: The APIA approach," Proc. ICPR 2002 (Quebec, Canada), Aug. 11-15, 2002.
SIMULATION OF RADIO-FREQUENCY ABLATION USING COMPOSITE FINITE ELEMENT METHODS
T. PREUSSER AND H.-O. PEITGEN
CeVis, University of Bremen, Universitätsallee 29, 28359 Bremen, Germany

F. LIEHR, M. RUMPF AND U. WEIKARD
University of Duisburg-Essen, Lotharstrasse 65, 47048 Duisburg, Germany
S. SAUTER
University of Zurich, Winterthurerstrasse 190, 8057 Zurich, Switzerland
This paper deals with a model for the simulation of the radio-frequency ablation of tumors. We present a discretization based on an implicit time-stepping scheme and composite finite elements. This kind of spatial discretization is capable of resolving the complex structures of the organ, the tumor and the surrounding vessels while keeping the computational mesh simple and thus the effort small. Due to the implicit time-stepping scheme there is no restriction on the temporal step size. A multigrid solver further enhances the performance of the method.
1. Introduction
The radio-frequency (RF) ablation of primary and metastatic tumors has become a promising treatment as an alternative to chemotherapy, radiotherapy and surgical resection. Together with appropriate mathematical, physical and biochemical models describing the process, the success of the treatment can be estimated or even optimized. The goal is to reduce the recurrence rate by a complete destruction of the malignant tissue. In this work we consider RF ablation with bipolar systems: a probe, internally cooled and containing two electrodes, is placed in the vicinity of the malignant tissue. The electrodes are connected to a generator, which delivers a power of 30 W to 200 W at a frequency of f = 500 kHz. Once the generator is switched on, an electric current warms the tissue close to the probe up to temperatures of more than 60°C. Consequently the proteins of the heated tissue denature and its cells die. The treatment is successful if all tumor cells are destroyed by the denaturation of their proteins.
In areas distant from the probe the critical temperatures can only be achieved by propagation of heat away from the source. Here, the perfusion of the surrounding tissue by large vessels and capillary blood flow has a significant cooling effect which needs to be modeled properly. To enlarge the volume of coagulated tissue (i.e. denatured proteins) and to decrease the influence of perfusion, the tumor can be penetrated with multiple probes simultaneously. This paper presents a simulation model for RF ablation, based on previous work of Stein4, but discretized with Composite Finite Element (CFE) methods2. CFE methods are characterized by the capability of resolving complicated geometries while keeping the computational mesh simple; in particular, not the mesh but the basis functions are adapted to the underlying domain and structures. Thus, the model can take into account the complicated geometry of large vessels permeating the tissue without increasing the computational effort. In Section 2 we briefly discuss related work. A short review of the simulation model follows in Section 3. The discretization is presented in Section 4. Results of the application of the simulation are shown in Section 5. Finally we draw conclusions in Section 6.
2. Related Work
During the last decade the simulation of RF ablation using monopolar probes has been modeled and simulated by several authors. The models consist of a system of partial differential equations which describe the electric potential in the vicinity of the RF probe, the evolution of heat in the tissue and finally the damage which is inflicted on the tissue. The models are discretized using Finite Difference or Finite Element (FE) methods on uniform rectilinear grids or tetrahedral meshes. Unfortunately, a lot of effort has to be put into building a computational mesh which resolves the complicated geometry consisting of parenchyma, malignant tissue, vessels, probes, etc.
Only some of the models account for all the physical phenomena (non-linear dependence of material parameters, perfusion, vaporization of water and nitrogen, etc.4) which steer the ablation process.
3. The Simulation Model
The simulation of the RF ablation consists of three parts: First, the supply of energy in the form of the RF current must be computed. In a second step, the conversion of this energy into heat and the propagation of this heat in the
Figure 1. Left: Schematic sketch of the computational domain and the inscribed structures. Middle: Visualization of the probe penetrating the sample tumor. Right: Visualization of the probe together with a set of large vessels cooling the domain. The vessels have been obtained by a watershed segmentation from a real liver CT scan.
tissue is calculated under consideration of the cooling effects of surrounding vessels as well as the vaporization and condensation of nitrogen and water. The last step of the simulation is to evaluate the effect of the heat on the tissue and to estimate the damage which will be inflicted. The stages of the simulation depend on a huge variety of material properties, ranging from the electrical and thermal tissue attributes to biochemical properties. All these tissue parameters depend mostly nonlinearly on the frequency of the electric current, the temperature T of the tissue, as well as its dehydration and the state of protein denaturation. In the next paragraphs we review the model presented by Stein4. To keep the presentation simple, we do not consider all material constants which appear in the original model. Instead, we reduce the material parameters to the constants which are essential for understanding the model, and we refer the reader to the work of Stein4 for a detailed description.
Electric potential. Let us consider a domain Ω ⊂ ℝ³ and different materials with complicated boundaries present in this domain: we model a spherical tumor Ωt ⊂ Ω, which is furthermore surrounded by vessels Ωv ⊂ Ω (cf. Fig. 1). In the implementation the boundaries of the different materials are given as the zero level sets of certain functions. The aim is to find the distribution of temperature T : [0, T) × Ω → ℝ⁺ within a time interval [0, T) for T > 0. We start with the analysis of the electric potential generated by the electrodes of the probe. Since the frequency of the electric current is f = 500 kHz, the electromagnetic field E can be considered quasi-static4, such that its potential P can be computed from a simple divergence equation: Find P : Ω → ℝ such that

    −div(σ(x) ∇P(x)) = 0   in Ω,
    P(x) = g±(x)           on Γ±.       (1)
Here σ(x) is the conductivity of the tissue and g± are the boundary values of the potential prescribed on the electrode boundaries Γ± (cf. Fig. 1).
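As an illustration (not part of the paper; all names are ours), the discrete analogue of equation (1) on a small uniform 2D grid can be solved by a Gauss-Seidel iteration in which the electrode potentials g± enter as fixed Dirichlet values and the variable conductivity σ enters through face averages:

```python
import numpy as np

def solve_potential(sigma, fixed, n_iter=500):
    """Gauss-Seidel sweeps for -div(sigma grad P) = 0 on a 2D grid.

    sigma : (n, n) conductivity field, sampled per cell
    fixed : dict mapping grid index (i, j) -> prescribed potential
            (the electrode values g+/g-); the outer boundary stays at 0
    """
    n = sigma.shape[0]
    P = np.zeros((n, n))
    for (i, j), v in fixed.items():
        P[i, j] = v
    for _ in range(n_iter):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                if (i, j) in fixed:
                    continue  # Dirichlet node: keep the electrode value
                # arithmetic average of the cell conductivities at each face
                sN = 0.5 * (sigma[i, j] + sigma[i - 1, j])
                sS = 0.5 * (sigma[i, j] + sigma[i + 1, j])
                sW = 0.5 * (sigma[i, j] + sigma[i, j - 1])
                sE = 0.5 * (sigma[i, j] + sigma[i, j + 1])
                P[i, j] = (sN * P[i - 1, j] + sS * P[i + 1, j]
                           + sW * P[i, j - 1] + sE * P[i, j + 1]) / (sN + sS + sW + sE)
    return P
```

By the discrete maximum principle, the computed potential stays between the prescribed electrode values, mirroring the behavior of the continuous bipolar problem.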
Heat distribution. The distribution of the energy delivered by the electric potential P is computed using a heat equation (also referred to as the bioheat transfer equation) with appropriate source terms: Find T : ℝ⁺ × Ω → ℝ such that

    ∂t T(x,t) − div(λ(x) ∇T(x,t)) = Q_in + Q_perf + Q_pc   in Ω,
    T(x,t) = T_probe   on Γp,                               (2)
    T(x,t) = T_body    on ∂Ω.

Here, λ(x) denotes the heat conductivity. The heat sources/sinks on the right-hand side of the above equation are the following:

    Q_in(x)      = σ(x) |∇P(x)|²             heating by the electric current,
    Q_perf(t,x)  = ν(x) (T_body − T(x,t))    cooling by perfusion (cf. Pennes3).

The heating Q_in of the tissue is due to the electric current. The term Q_perf models the perfusion by capillary blood vessels in terms of the parameter ν(x). We steer the cooling by large vessels with ν(x) as well (see below). The cooling of the probe is modelled with an appropriate Dirichlet condition T(x,t) = T_probe on the boundary Γp (cf. Fig. 1). We consider the tissue to be destroyed as soon as it is exposed to a critical temperature T_crit = 54°C. But protein denaturation already begins at lower temperatures; this can be taken into account by the Arrhenius formalism4, in which the applied temperature is integrated over time.
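The two source terms above can be sketched on a grid as follows (an illustration of ours, not the paper's code; the field names and values are assumptions):

```python
import numpy as np

def heat_sources(P, T, sigma, nu, h, T_body=310.0):
    """Evaluate Q_in = sigma |grad P|^2 and Q_perf = nu (T_body - T) on a 2D grid.

    P, T      : potential and temperature fields, shape (n, n)
    sigma, nu : conductivity and perfusion-parameter fields, shape (n, n)
    h         : grid spacing; T_body in Kelvin
    """
    gP0, gP1 = np.gradient(P, h)          # central differences along both axes
    q_in = sigma * (gP0**2 + gP1**2)      # Joule heating by the RF current
    q_perf = nu * (T_body - T)            # Pennes-type perfusion cooling
    return q_in, q_perf
```

For a linear potential with unit slope, Q_in reduces to σ, and at body temperature the perfusion term vanishes, matching the formulas above.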
Material parameters. Investigations by several authors have shown that for a fixed frequency f = 500 kHz the material parameters σ, λ, ν depend nonlinearly on the temperature T and on the coagulation and dehydration state of the tissue. To keep the presentation simple, however, we assume the material coefficients to be piecewise constant over the subdomains Ω, Ωt and Ωv. We model the different behavior of the materials by prescribing σ, λ and ν to have jumps at the corresponding interfaces between the regions of different tissue. In particular we have for the parameters involved here:

    (σ, λ, ν)(x) = (σ_t, λ_t, ν_t)   for x ∈ Ωt,
                   (σ_p, λ_p, ν_p)   for x ∈ Ωp,
                   (σ_v, λ_v, ν_v)   for x ∈ Ωv,
                   (σ_0, λ_0, ν_0)   else.
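Since the material boundaries are given as zero level sets, the piecewise parameter choice can be sketched as a simple membership test (our illustration; the tumor/vessel/tissue values are borrowed from the parameter list in Section 5, the probe values are placeholders):

```python
def material_params(x, phi_t, phi_p, phi_v, params):
    """Return the (sigma, lambda, nu) triple at point x.

    phi_t, phi_p, phi_v : level-set functions; phi(x) < 0 means x lies inside
                          the tumor, probe, and vessel region, respectively.
    params : dict with keys 'tumor', 'probe', 'vessel', 'tissue',
             each mapping to a (sigma, lambda, nu) tuple.
    """
    if phi_t(x) < 0:
        return params['tumor']
    if phi_p(x) < 0:
        return params['probe']
    if phi_v(x) < 0:
        return params['vessel']
    return params['tissue']
```

The jump in the coefficients across an interface is exactly what the CFE basis functions of Section 4 must resolve.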
4. Discretization
In the following section we describe the discretization of the above model using CFE methods. We denote the Sobolev spaces of functions vanishing on Γ± := Γ+ ∪ Γ− or on Γp ∪ ∂Ω by H¹_Γ±(Ω) and H¹_Γp∪∂Ω(Ω), respectively. The L² scalar product on Ω is denoted by (·, ·). Let us now assume that the inhomogeneous Dirichlet boundary value problems (1) and (2) have been converted to homogeneous problems by the usual transformations. Multiplying equation (1) with a test function and integrating by parts over Ω, we find the weak form of the problem: Find P ∈ H¹_Γ±(Ω) such that for all ψ ∈ H¹_Γ±(Ω):

    (σ ∇P, ∇ψ) = R_P(ψ),

where R_P results from the transformation to homogeneous boundary conditions.
Temporal discretization. We proceed with the temporal discretization of the heat transport. Applying a backward Euler scheme with time step τ, denoting the heat T(nτ, ·) at time nτ by Tⁿ and multiplying with a test function, we again obtain the weak form after integration by parts: For each time step n find Tⁿ ∈ H¹_Γp∪∂Ω(Ω) such that for all ψ ∈ H¹_Γp∪∂Ω(Ω):

    (Tⁿ, ψ) + τ (λ ∇Tⁿ, ∇ψ) = (Tⁿ⁻¹, ψ) + τ (Q_in + Q_perf + Q_pc, ψ) + R(ψ),
where R results from the transformation to homogeneous boundary conditions from above.
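The unconditional stability claimed in the abstract ("no restriction on the temporal step size") can be illustrated with a minimal 1D backward Euler step, solved as a linear system (our sketch with finite differences, not the paper's CFE discretization; all names are illustrative):

```python
import numpy as np

def backward_euler_step(T, lam, tau, h, source):
    """One implicit step of T_t - lam * T_xx = source on a 1D grid,
    with the two boundary values of T held fixed (Dirichlet)."""
    n = len(T)
    r = lam * tau / h**2
    A = np.zeros((n, n))
    rhs = T + tau * source
    A[0, 0] = A[-1, -1] = 1.0          # Dirichlet rows
    rhs[0], rhs[-1] = T[0], T[-1]
    for i in range(1, n - 1):          # tridiagonal interior rows
        A[i, i - 1] = -r
        A[i, i] = 1.0 + 2.0 * r
        A[i, i + 1] = -r
    return np.linalg.solve(A, rhs)
```

Even for a very large time step the solution stays bounded by the initial data (discrete maximum principle), whereas an explicit step would blow up.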
Spatial discretization using CFE. In the following we describe the spatial discretization using a CFE method. Let us assume we have constructed a standard hierarchical FE space with grids M⁰ ⊂ ... ⊂ M^lmax and elements E_i^l ∈ M^l. We emphasize that each element E_i^l can be described as the union of six tetrahedra: E_i^l = ∪_k T_{i,k}^l. On the vertices x_i^l ∈ N^l we have the standard hat functions φ_i^l (φ_i^l(x_j^l) = δ_ij for all x_j^l ∈ N^l). These basis functions form the FE space V^l. The standard elements/grids do not resolve the complicated structure of the interfaces between the different materials, which we denote by γ in the following. Since we assume the material parameters (σ, λ, ν) to be discontinuous across γ, it is essential for the quality of the approximation that γ is resolved by the grid. To approximate the structural details sufficiently, we would normally have to increase the depth l_max and thus the number of degrees of freedom. With the CFE approach this is not necessary, due to the adaptation of the basis functions as shown in the following: Let us consider a node x_i^lmax ∈ N^lmax and its hat-shaped basis function φ_i^lmax. If the support ω_i^lmax of φ_i^lmax is not intersected by γ, we keep the standard φ_i^lmax. But if ω_i^lmax is intersected by γ, we consider the set of intersection points

Figure 2. A schematic 2D sketch shows the construction of the CFE function. From left to right: An element E is cut by the interface γ. The element is cut into tetrahedra T and the interface approximated linearly. The intersections of the linearized interface γ̃ and the edges of the tetrahedra define the sub-tetrahedra on which the CFE basis function φ^cfe is constructed. Finally a sketch of a 2D CFE basis function is shown.
S_i := {x ∈ ∂T ∩ γ | T ⊂ ω_i^lmax} of the tetrahedra T_{i,k} with the interface γ. The points in S_i define a linear approximation γ̃ of γ, and the tetrahedra are cut by γ̃ into smaller tetrahedra and prisms. Dividing the prisms into tetrahedra again, we end up with a collection T̃_i of sub-tetrahedra which cover ω_i and whose vertices lie in N^lmax or in S_i. The set of local degrees of freedom is Ñ_i^lmax = (N^lmax ∪ S_i) ∩ ω_i. The sub-tetrahedra form the local FE space

    Ṽ_i^lmax = span{ φ̃_k | φ̃_k(x_j) = δ_kj for x_j ∈ Ñ_i^lmax, φ̃_k affine on each T ∈ T̃_i },

from which we construct the new CFE basis functions

    φ_i^cfe = Σ_j α_ij φ̃_j.

It remains to construct the α_ij ∈ [0, 1] such that the appropriate jump conditions imposed by the coefficients of the problem are resolved. For the standard grid nodes x_j ∈ N^lmax, we set α_ij = δ_ij. For the nodes x_j ∈ S_i, we compute the ratio of the coefficient on either side of the interface γ̃. If we denote the values of a coefficient d(x) by d⁺ and d⁻ on opposite sides of γ̃, we set α_ij = (1 − c) d⁺/(d⁺ + d⁻) if d(x_j) = d⁺ and α_ij = c d⁻/(d⁺ + d⁻) else. Here, c is the ratio at which γ̃ cuts the edge of the corresponding hexahedral element on which x_j lies (cf. Fig. 2).
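The weight formula for the interface nodes is small enough to state directly in code (our transcription of the formula above; the function name and argument names are ours):

```python
def cfe_weight(c, d_plus, d_minus, node_on_plus_side):
    """Weight alpha_ij for a CFE basis function at an interface node.

    c : fractional position at which the linearized interface cuts the
        element edge on which the node lies (0 <= c <= 1)
    d_plus, d_minus : values of the discontinuous coefficient on the
        two sides of the interface
    node_on_plus_side : True if the coefficient at the node equals d_plus
    """
    s = d_plus + d_minus
    if node_on_plus_side:
        return (1.0 - c) * d_plus / s
    return c * d_minus / s
```

For positive coefficients and c in [0, 1] the resulting weights always lie in [0, 1], as required for α_ij.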
Implementation. As described above, the interfaces γ are obtained as the zero level sets of certain functions f. Thus, the test whether an element E is cut by γ reduces to a comparison of the signs of f at the vertices of E. In the implementation, the construction described in the last section need not be carried out anew on every element E which is intersected by an interface γ. Indeed, a straightforward analysis shows that only the proportion c of the cut edge and the volume of the sub-tetrahedra are of importance. In a preprocessing step a lookup table is computed containing the local basis functions on an element E for all different cases (i.e. the combinations of signs of f at the vertices) of how the element E is cut by the interface. During matrix assembly the basis functions only have to be scaled with the size of the sub-tetrahedra and put into the system matrix.
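The sign-pattern lookup can be sketched as follows (our illustration; the bit-mask encoding and the memoized `build` callback are assumptions, not the paper's code):

```python
def sign_pattern(f_values):
    """Map the level-set values at an element's vertices to a case index.

    Bit k of the index is set iff f is negative at vertex k, so all
    elements cut the same way share one precomputed table entry."""
    key = 0
    for k, v in enumerate(f_values):
        if v < 0.0:
            key |= 1 << k
    return key

# hypothetical cache: case index -> precomputed local construction
_case_table = {}

def local_case(f_values, build):
    """Return the cached construction for this sign pattern,
    running build(key) only once per pattern."""
    key = sign_pattern(f_values)
    if key not in _case_table:
        _case_table[key] = build(key)
    return _case_table[key]
```

A pattern of all-equal signs (index 0 or all bits set) indicates an element that is not cut, for which the standard basis functions are kept.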
A multigrid solver. Once the system matrix is built, we can set up a multigrid solver using restrictions and prolongations which respect the CFE basis functions. Thus, a hierarchy of basis functions and, correspondingly, a hierarchy of operators/matrices is generated which resolves the interfaces on the coarse levels as well. In the applications shown in the next section, we choose two pre- and post-smoothing steps. Smoothing is done by a conjugate gradient method.
5. Applications
In Figure 3 the results of the simulation with one and with three cooled bipolar applicators are shown. The geometry of the computational domain is as shown in Figure 1. The segmentation of the vessels is taken from a CT scan of a human liver. On the slices orthogonal to the applicator one clearly sees how the influence of the perfusion destroys the symmetry of the heat profile. This underlines the difficulties of RF ablation in the neighborhood of large vessels and the importance of simulations. The parameters used for the computation are the following: grid width h = 2⁻⁶, number of degrees of freedom (2⁶ + 1)³, probe σ_p = 1.0, tumor (σ_t, λ_t, ν_t) = (0.2, 1.2, 0.05), vessel (σ_v, λ_v, ν_v) = (0.66, 0.8, 500), surrounding tissue (σ_0, λ_0, ν_0) = (0.167, 1.0, 0.005).
6. Conclusions and Future Work
We have shown a CFE discretization of a simple model for the simulation of RF ablation. The results show that the CFE method can resolve the complicated geometry of vessels and organs without increasing the number of degrees of freedom. Future work includes an improvement of the model, in particular the non-linear behavior of the material parameters. A more detailed presentation of the results, including color figures and movies, can be found on the web page http://www.mevis.de/~tp/rfitt
Acknowledgements. The authors would like to thank Dr. A. Roggan and Dr. T. Stein from Celon AG for fruitful discussions on the topic.
(Figure 3 panel labels: potential isosurfaces P = −0.6, −0.2, 0, +0.2, +0.6; simulation times t = 0.02, 0.04, 0.06, 0.08, 0.1.)
Figure 3. Results from the simulation are shown. First row: Isosurfaces of the potential of a bipolar probe. Second row: The 330 K isosurface of the temperature distribution is shown for successive times of the simulation; the perfusion by large vessels prevents heating of the tissue near them. Third row: Cuts orthogonal to the applicator. The color codes the temperature on a ramp from blue (310 K = T_body) to red (325 K > T_crit). Fourth row: Result from a simulation with three cooled bipolar probes.
References
1. T. E. Cooper and G. J. Trezek. Correlation of thermal properties of some human tissues with water content. Aerospace Med., 42:24-27, 1971.
2. W. Hackbusch and S. Sauter. Composite finite elements for the approximation of PDEs on domains with complicated micro-structures. Numer. Math., 75:447-472, 1997.
3. H. H. Pennes. Analysis of tissue and arterial blood temperatures in a resting forearm. J. Appl. Physiol., 1:93-122, 1948.
4. T. Stein. Untersuchungen zur Dosimetrie der hochfrequenzstrominduzierten interstitiellen Thermotherapie in bipolarer Technik. Number 22 in Fortschritte in der Lasermedizin. Müller, Berlien, LMTB, 2000.
5. S. Tungjitkusolmun, S. T. Staelin, D. Haemmerich, et al. Three-dimensional finite-element analyses for radio-frequency hepatic tumor ablation. IEEE Transactions on Biomedical Engineering, 49(1):3-9, 2002.
AN INTERACTIVE PLANNING AND SIMULATION TOOL FOR MAXILLO-FACIAL SURGERY
G. BERTI, J. FINGBERG AND J. G. SCHMIDT*
C&C Research Laboratories, NEC Europe Ltd., Rathausallee 10, D-53757 Sankt Augustin, Germany
E-mail: {berti,fingberg,schmidt}@ccrl-nece.de
T. HIERL
University Hospital Leipzig, Department of Oral and Maxillofacial Surgery, Nürnberger Str. 57, D-04103 Leipzig, Germany
E-mail: hiet@medizin.uni-leipzig.de
We present a chain of software tools designed to help plan and predict the outcome of distraction osteogenesis operations in maxillo-facial surgery. The chain starts with a CT image of the patient's head, which is segmented and used to produce a surface mesh of the skull. Next, the surgeon interactively defines the cuts and the parameters of the distraction process. This information, together with the CT data, is used to generate a finite element (FE) mesh, including boundary conditions and prescribed displacements or forces. After the FE problem is solved on a remote high-performance compute server using linear or non-linear solution methods, the resulting displacements of bones and soft tissue can be visualized in various ways in order to assist the surgeon in designing the appropriate surgical intervention. The entire tool chain is developed as a Grid application with the overall aim of making advanced simulation accessible to the non-technical clinician, providing transparent and secure access to remote computing facilities via a novel layer of middleware.
1. Introduction Severe malformations of the midface such as maxillary retrognathia or hypoplasia can be treated by distraction osteogenesis. During an operation the appropriate bony part of the midface is separated from the rest of the skull (osteotomy) and slowly moved into the ideal position by way of a distraction device (cf. Figure 1). Thus even large displacements over 20 mm *The work of these authors is supported by the European Commission under grant IST-2001-37153
can be treated effectively.
Figure 1. Patient before treatment (left) and at end of treatment with distraction device mounted (right).
A critical point in this procedure is osteotomy design and the resulting outcome with respect to aesthetics. In current clinical practice, planning is based on CT scans and the surgeon's experience. Our tool chain allows the surgeon to specify arbitrary cuts of facial bones (pre-processing), to simulate the displacements of bones and soft tissue, and to visualize the results in an appropriate way (post-processing). It therefore provides the possibility to predict and compare the outcome of different surgical treatments in silico. The overall goal of our approach is to give the average clinician access to the advanced simulation technology of such surgical planning tools. Physicians typically lack both the technical expertise and the time needed to prepare input for numerical simulations. Nor do they generally have access to the necessary HPC (high performance computing) hardware and software. Thus, a successful approach must provide a specialized toolchain which performs the necessary preprocessing tasks autonomously - except for the proper "virtual surgery" - and gives transparent and secure access to remote HPC resources. The GEMSS project3,1, of which the present work is a part, develops middleware aimed at solving these problems, thus bringing advanced simulation services closer to the practitioner's desktop. A number of researchers have obtained results in the field of computational maxillo-facial surgery planning. Koch5 describes a system based on linear elasticity, where osteotomies are specified on 2D pictures. Schutyser et al.8 use a 3D "virtual knife" and emphasize the real-time aspects of
simulation, trading accuracy for speed. Zachow et al.11 use a specialized version of the Amira visualization system for most parts of the toolchain, including mesh generation. Simulation is restricted to linear FEM models. Gladilin4 extends these linear models to first non-linear FEM simulations of the distraction process, using a St. Venant-Kirchhoff material model. Our approach is different in that it is focused on autonomous usage by non-technical users, offering transparent access to high-performance platforms, thus enabling the user to employ compute-intensive, high-fidelity numerical methods (cf. Sec. 4). The rest of this paper is organized as follows: In Sec. 2, we give an overview of the components of the toolchain. Section 3 discusses the interactive osteotomy tool in some detail. Then, we give some background on the FEM simulation in Section 4.
2. A Toolchain for Maxillo-facial Surgery Planning
For the complete task of maxillo-facial surgery planning and simulation, a number of sub-tasks have to be solved. The raw CT image has to be segmented, and optionally registered to a template head image. Next, a geometric 3D representation of the bone surface is generated from the image, which is handed over to an interactive surgery specification (bone cutting) tool. The output of this tool must be checked for consistency and incorporated into a volumetric FEM model. This step involves 3D mesh generation, application of boundary conditions and additional mesh checking, e.g. for spurious disconnected components. Then, the FEM simulation is run, and finally the results are visualized and interpreted by the surgeon. Now, clinicians typically are not experts in image processing, meshing or FEM simulation. So, most of the toolchain should run in an automated way, with the obvious exception of the bone cutting task, to be described in the next section. Some of the tasks, most importantly the FEM analysis, but possibly also mesh generation, require substantial computing resources which are generally not available at clinics and may be difficult to use (e.g. supercomputers or clusters). Thus, transparent access to remote computing facilities is necessary. On the other hand, some surgeons may already routinely use some third party software, such as volume visualization, for their surgery planning, and may want to incorporate such tools into the toolchain. Also, when improved or new functionalities become available, it is useful to be able to easily replace existing tools or to offer the new tools as an alternative.
These considerations led us to a highly modular toolchain composed of loosely coupled components, as opposed to a monolithic maxillo-facial surgery planning application. We use the Triana workflow editor10 to manage the toolchain. Triana offers easy workflow configuration via a graphical programming language, and is also suitable for wrapping remote execution of tools in a transparent way, for instance via the GEMSS middleware1.
3. The Virtual Bone Cutting Tool
A crucial step in the toolchain is the specification of bone cuts and displacements. A suitable tool for this task should have the following properties:
- It should not impose constraints on the number of displaced bone components or on the displacements.
- It should be ergonomic and support the user by giving visual feedback on the user input.
- It should provide quantitative aids like length measurements.
We chose to build our cutting tool on top of OpenDX7, because it offers a wide range of visualization features and sufficient interaction capabilities. Also, it is easily extensible by user-provided modules. This approach has the advantage of quickly arriving at a working prototype. The flow of action within the cutting tool is as follows: First, the user specifies a number of cuts as closed polygons by selecting points on the bone surface, which is represented by a surface mesh. Then, he chooses the bone component(s) to be distracted, and specifies the corresponding translation(s) and rotation(s). An important feedback feature we plan to integrate in the near future is the visualization of the displacements by moving the components to their specified positions. After all cuts have been specified, they are converted into three-dimensional volumes which are used to actually apply the cuts to a volumetric model. The conversion takes place in two steps: First, the non-planar polygon is triangulated; second, this triangulation is extruded along the normal directions at each vertex, with a user-specified thickness. The complex geometry of the human head may turn the placement of cuts into a tedious problem. In order to support the surgeon, we provide clipping planes and selectable transparency of the bone. In addition, the 3D location of the cuts is visually emphasized by using balls and tubes for the vertices and edges of the polygons, see Fig. 2.
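The second conversion step, extrusion along the vertex normals, can be sketched as follows (our illustration, not the tool's actual code; names are assumptions):

```python
import numpy as np

def extrude_cut(vertices, normals, thickness):
    """Turn a triangulated cut surface into a thin slab by offsetting each
    vertex by +/- thickness/2 along its unit vertex normal.

    vertices : (n, 3) array of triangulated cut-polygon vertices
    normals  : (n, 3) array of unit normals at these vertices
    Returns the (2n, 3) vertex array of the extruded slab (top, then bottom).
    """
    offset = 0.5 * thickness * normals
    top = vertices + offset
    bottom = vertices - offset
    return np.vstack([top, bottom])
```

The resulting slab volume is then subtracted from the volumetric model to realize the cut.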
Another difficulty is the verification of the separation of components.
It may happen that parts intended to be separated by cuts are in fact still connected by small bridges, which would grossly distort the subsequent simulation. A possible source of such bridges are segmentation artifacts such as the missing separation of upper and lower teeth. For finding such bridges, we have developed a coloring tool which colors a component in a wavefront starting from a selected seed point. A bridge will then be detectable as a "color leak" (cf. Fig. 3).
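The wavefront coloring amounts to a breadth-first traversal of the mesh connectivity; the following sketch (our illustration; the adjacency representation is an assumption) shows how a bridge manifests as color leaking into a component that should have stayed uncolored:

```python
from collections import deque

def wavefront_color(adjacency, seed):
    """Breadth-first 'wavefront' coloring: every vertex reachable from the
    seed gets its BFS distance as color; unreachable vertices stay -1.

    adjacency : dict mapping vertex -> iterable of neighboring vertices
    """
    color = {v: -1 for v in adjacency}
    color[seed] = 0
    queue = deque([seed])
    while queue:
        v = queue.popleft()
        for w in adjacency[v]:
            if color[w] == -1:
                color[w] = color[v] + 1
                queue.append(w)
    return color
```

If any vertex of the component meant to be distracted receives a color, the cut has left a bridge and must be revised.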
Figure 2. The cutting tool: Front view (left) and side view with clipping (right). The current cut is highlighted. The cut separating lower and upper jaw is necessary to overcome segmentation artifacts introducing unphysical connections.
Figure 3. Visualization of bridges by using a wavefront coloring of components. The continuous coloring clearly shows that the maxilla is not entirely cut in its right part (left image). It is also visible that the lower jaw is separated from the rest of the skull.
4. Distraction Simulation
The simulation of the distraction process is done by a Finite Element analysis. Both time and memory requirements of such an analysis are far beyond the capabilities of a single workstation or PC, and the analysis is therefore carried out remotely on a parallel machine via the GEMSS middleware. The memory requirements depend on the resolution of the mesh and on the memory consumption of the linear solver. Since our FE discretization uses displacement-based elements like linear or quadratic tetrahedra and hexahedra, the number of unknowns in the resulting linear systems is approximately 3-6 times the number of elements. Typical meshes consist of 10^5 to 10^6 elements, resulting in memory requirements roughly ranging from 3 GByte to 30 GByte for linear elements. In order to solve the resulting systems we use iterative solvers, namely preconditioned Krylov subspace solvers. We use algebraic multigrid preconditioning where possible, since it shows optimal complexity, and ILU preconditioning where necessary, which is a more stable but slower method. The user can choose between different levels of elasticity models. For a fast but possibly rough estimation of the resulting soft tissue deformations we supply a linear elastic material model. In order to enhance the accuracy we discretize the resulting equations with highly efficient EAS elements9. A linear problem with 100,000 elements is solved in about 5 minutes on 16 nodes of a PC cluster with an AMD Athlon 1900+ CPU and 2 GB of memory at every node. For a more detailed analysis, the user has to provide details about the distraction process in time, i.e. about the velocity of the prescribed displacements or the time-dependent changes of the distracting forces. Here the material is modeled by a viscoelastic material law, based on a geometrically nonlinear hyperelasticity. These computations last several hours, but give the surgeon a detailed view of the development of the resulting displacements, forces and stresses in time.
Non-linear computations are crucial for obtaining realistic values for the relations between displacements and forces/stresses. Right now we distinguish between bone and soft tissue and model both materials by isotropic laws. In the future we plan to incorporate additional information from the CT image and a template head model based on combined CT and MR data in order to get a more realistic distribution of the material parameters. Those parameters and the use of specialized material models for skin, muscles and other kinds of tissue are expected
’.
to further improve the accuracy of our simulations. For details on more complex models the reader is referred to 2 and 6.
Figure 4. Patient before treatment (left) and simulated surgery (right), using volume rendering of the original and deformed CT image.
5. Conclusion
The presented tool chain enables the surgeon to predict the outcome of a distraction osteogenesis for an arbitrary set of cuts and distractions and is therefore a valuable tool for planning such treatments. By using advanced Grid computing infrastructure, the crucial time- and memory-intensive parts of the tool chain can be executed remotely on an HPC server. This enables the surgeon to get results from adequate state-of-the-art simulation within acceptable times, without needing to worry about technical details or security issues. The development of the tool chain is still ongoing work. The clinical evaluation of the tool is still pending, and we look forward to improving the tool by incorporating the feedback of the medical experts who are testing it. In particular, we plan to make the cutting tool more ergonomic by offering automatic fitting of cut lines to the skull surface geometry, and to use registration of a template head to import auxiliary data such as clipping information. Another important improvement will concern the removal or reduction of metal artifacts which may distort the simulation. We also plan to use a registration approach to map more soft tissue details like muscle strings, and to compare quantitative differences between simulations run with different material laws and resolutions. This will give us a clearer picture of the
tradeoffs between computation time, sophistication of material modeling and accuracy of simulation.
Acknowledgments The middleware used for the remote execution of the simulation jobs was developed by our partners in the GEMSS project. Special thanks go to Junwei Cao who integrated the entire toolchain into Triana and the GEMSS middleware. Most of the image processing tools were developed by F. Kruggel and G. Wollny at the Max-Planck-Institute of Cognitive Neuroscience in Leipzig.
References
1. S. Benkner, G. Berti, G. Engelbrecht, J. Fingberg, G. Kohring, S. E. Middleton, and R. Schmidt. GEMSS: grid infrastructure for medical service provision. In Proceedings of HealthGrid 2004, 2004.
2. Y. Fung. Biomechanics: Mechanical Properties of Living Tissues. Springer, Berlin, 2nd edition, 1993.
3. The GEMSS project: Grid-enabled medical simulation services. http://www.gemss.de, 2002. EU IST project IST-2001-37153, 2002-2005.
4. E. Gladilin. Biomechanical Modeling of Soft Tissue and Facial Expressions for Craniofacial Surgery Planning. PhD thesis, Fachbereich Mathematik und Informatik, Freie Universität Berlin, 2003.
5. R. Koch. Methods for Physics Based Facial Surgery Prediction. PhD thesis, Institute of Scientific Computing, ETH Zurich, 2001. Diss. No. 13912.
6. W. Maurel. 3D Modeling of the Human Upper Limb Including the Biomechanics of Joints, Muscles and Soft Tissue. PhD thesis, École Polytechnique Fédérale de Lausanne, 1998.
7. OpenDX homepage. http://www.opendx.org, 2000.
8. F. Schutyser, J. V. Cleynenbreugel, J. Schoenaers, G. Marchal, and P. Suetens. A simulation environment for maxillofacial surgery including soft tissue implications. In Proceedings of MICCAI 1999, pages 1210-1217, 1999.
9. J. Simo and M. Rifai. A Class of Mixed Assumed Strain Methods and the Method of Incompatible Modes. Int. J. Num. Meth. Engng., 29:1595-1638, 1990.
10. Triana homepage. http://www.triana.co.uk/, 2003.
11. S. Zachow, E. Gladilin, H.-F. Zeilhofer, and R. Sader. Improved 3D osteotomy planning in cranio-maxillofacial surgery. Lecture Notes in Computer Science, 2208:473-481, 2001.
Robotic Interventions
PRINCIPLES OF NAVIGATION IN SURGICAL ROBOTICS
DOMINIK HENRICH AND PHILIPP STOLKA
Lehrstuhl für Angewandte Informatik III (Robotik und Eingebettete Systeme)
Fakultät für Mathematik, Physik und Informatik
Universität Bayreuth, 95440 Bayreuth
E-Mail:
[email protected], http://ai3.inf.uni-bayreuth.de
In this paper, we propose a framework for the different types of navigational problems in surgical robotics. Using robots in medicine, and especially in surgery, requires an adequate representation of and reaction to a changing environment. This is achieved by modeling at different abstraction levels throughout the process, ranging from 3D imaging modalities which reflect the environment geometry to finding appropriate control parameters for actual motion. Between global navigation and control, we introduce the concept of local navigation into surgical robotics, i.e. the concurrent creation and maintenance of a local environment map for navigation purposes. This intermediate level of sensory feedback and processing allows the system to react to changes in the environment. Furthermore, local navigation permits the sampling of additional information which may be unattainable before process execution, or attainable only with reduced precision. We illustrate this idea of nested control loops on the basis of car driving and a specific surgical application, robot-based milling at the lateral skull base.
1. Introduction
Robotic applications with changing environmental properties require a precise and up-to-date representation of the environment in order to fulfill specific tasks such as safe path planning. This representation has to encompass several levels of abstraction, precision, and timeliness. Thus, data sampling occurs at different instants of time during process planning and execution. Current surgical robot systems rely mainly on two sources of information: global spatial data sampled during a planning phase before process execution, which is then used statically for global navigation, and local data sampled during the process, which is fed back and used in a non-spatial context in open- or closed-loop controllers of the process. Usually, the former lacks either resolution, segmentability, or both, while the latter only persists during the instant of sampling and is discarded immediately after entering the control cycles. However, there exist applications, especially with autonomous robots, for which an additional information type - intraoperative, spatial, current, and persistent sensor data - proves necessary to cope with uncertainty, measurement errors, and incompleteness of data. We describe how this kind of local information can be used together with the other navigation and control modes in a consistent manner, i.e. how it is integrated into a common handling strategy.
In Section 2, we give a short overview of the state of the art in robotic and surgical navigation. Section 3 explains the proposed navigation and control principles on the basis of an everyday example and a surgical robotic system. In Section 4, the proposed definitions are applied to the surgical robotic system RONAF (for a complete discussion, refer to e.g. [Henrich02]). We close with a discussion and possible future applications in Section 5.
2. State of the Art
For navigation in autonomous mobile robots, there usually exists a spatial map of the environment which may or may not be available before startup of the robot. While the first case is trivial in terms of map generation, and the robot system can concentrate on the tasks of localization and path planning based on this map, the second case proves more interesting in a more general sense of navigation. This leads to the field of simultaneous localization and mapping (SLAM), dealing with the twin problems of localization in an uncertain and incomplete map and mapping based on uncertain localization. The odometry of mobile robots is typically imprecise, so exact global localization is only possible in the vicinity of landmarks or with the help of a global positioning system. When neither is available, the robot has to rely on estimations. This introduces uncertainty both for updating and for reading from the map. Thus, although continuous sensor data sampling enhances and updates the environment model, its value is decreased due to position inaccuracy. The robot has to continually re-register itself with the (inaccurate) map, based on (uncertain) measurements. Surgical navigation systems (like infrared optical trackers, magnetic or ultrasound trackers), on the other hand, are useful for tracking the absolute position of objects (instruments or the patient) within the operating theatre. This ability is employed for conventional, computer-assisted interventions, where the surgeon performs the action manually while having an enhanced sense of position and orientation of his instruments relative to interesting structures, allowing for more precise or even previously impossible interventions. This strategy obviously requires the collection of preoperative data to compare with the current instrument pose. Conventionally, this is achieved by acquiring a 3D image (e.g. via computed tomography).
For a known and restricted application area, an alternative may be image-less navigation, based on generic anatomical models (atlases) which are registered with and adapted to the patient's individual features by sampling appropriate sets of surface points. With either method, the instrument is tracked intraoperatively and co-displayed with the anatomy. Navigational issues like region avoidance or path planning are up to the surgeon.
3. Principles of Navigation
In the following, we cover the different identified modes or principles of navigation. For each principle, we define the relevant terms and the principle itself, and illustrate it with two examples. On the one hand, we refer to a car driver who is to guide his vehicle from a start location to an end location in an unknown environment. On the other hand, we describe the actual use of the respective navigation principle in surgical robotics on the basis of the robot system RONAF* for otolaryngological surgery (for details, see Section 4).
3.1. Registration
Before describing the different navigation principles, we first have to define the concept of registration. For a correct environment representation, the objects relevant to the process need to be in the correct spatial and temporal relation to each other, i.e. the transformation between the respective associated local coordinate systems must be known. Registration is defined as the determination of this transformation; the transformation itself is also covered by the term registration. Conceptually, registration is performed by identifying pairs of points or surfaces in two data sets. For surgical robotics applications, an intuitive way to provide this identification is to use the robot as a localizing device, pointing at distinct features clearly distinguishable both in the patient's anatomy and in the existing data set to be registered with the robot. Another option is to use imaging modalities or external tracking devices like navigation systems (e.g. BrainLab VectorVision) which determine the relative positions of the patient and the robot together (co-registration). If some kind of relative motion occurs and is noticed through tracking or massive data mismatches, i.e. registration is lost, then re-registration becomes necessary. Registration in the car-driver example corresponds to finding one's own location in a street map. Re-registration should only be necessary when the driver has fallen asleep and lost orientation. In the surgical robot system RONAF, registration between the milling path and the patient is equivalent to location planning of the implant bed. For generic implant bed milling paths, this has so far been performed by pointing at the origin and axes of the implant coordinates with the robot itself, using a force-following scheme (Hybrid N/P Control, [Stolka03]). Note that this scheme does not require a global map. Gathering a 3D ultrasound map directly with the robot is possible as well.
Since the ultrasound sensor is rigidly attached to the robot, this map is implicitly registered and can be surface-matched with e.g. preoperative CT scans.
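The point-pair registration described above amounts to estimating a rigid transformation between two corresponding point sets. A minimal sketch of such a least-squares estimate (the standard Kabsch/SVD method; the function name and interface are ours for illustration, not the RONAF implementation):

```python
import numpy as np

def register_point_pairs(p_img, p_robot):
    """Least-squares rigid transform (R, t) mapping image points to robot points.
    p_img, p_robot: (N, 3) arrays of corresponding points, N >= 3, non-collinear."""
    c_img = p_img.mean(axis=0)                   # centroids of both point sets
    c_rob = p_robot.mean(axis=0)
    H = (p_img - c_img).T @ (p_robot - c_rob)    # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correction matrix guards against reflections (det(R) must be +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_rob - R @ c_img                        # translation aligning centroids
    return R, t
```

With noise-free correspondences the recovered transform is exact up to numerical precision; with noisy points it is the least-squares optimum, which is why surface matching (as with the robot-held ultrasound map) uses the same core step inside an iterative scheme.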
3.2. Global Navigation with Preoperative Map
For global navigation with a preoperative map, we require a data set of the intervention region which serves as a global map. This map is typically acquired preoperatively and is mainly used for planning. Locations and paths can be described
* RONAF: Robot-assisted Navigation for Milling at the Lateral Skull Base (Robotergestützte Navigation zum Fräsen an der lateralen Schädelbasis)
within this map in a global fashion. Obviously, this data set needs to be registered with the actual environment before process execution. This navigation principle does not impose any strict temporal restrictions on data sampling and process execution; however, the precision of the map and of the registration are of major importance. As a comparison, one might consider buying a complete street map of an unknown city. After having localized oneself in this map, one's own position can be tracked. A route can be planned, but it includes only information known at the time of map creation - crossings, streets, addresses, but no current data. A surgically relevant example is the generation of a 3D image of the patient with a modality like computed tomography or magnetic resonance imaging, serving as a global map for navigation. Besides the path planning necessary for an autonomous robot, a part of this navigation might be position optimization for implant components [Waringo03c]. For RONAF, one possible intervention is the autonomous milling of an implant bed. The position and orientation of this cavity relative to the skull bone have to be planned before execution, since later modifications are difficult or impossible. The surgeon provides a starting position for the implant, and an iterative optimization algorithm searches for an optimal fit of the implant's and bone's upper and lower contours.
3.3. Global Navigation with Intraoperative Map
Global navigation based on an intraoperatively acquired map is conceptually similar to the previous principle. Here as well, one has knowledge of the complete environment via a global map. However, acquisition may take place shortly before process execution, or even occasionally during the intervention. The assumption of a current environment representation thus becomes more plausible. Moreover, in this case co-registration is possible, i.e. combining the data sampling with the localization of the robot in the image data. The main goal of this principle is to provide global updates that are as current as possible. For the car driver, this might be equivalent to a street map individually generated by a routing tool, tailored exactly to his needs. In surgical robotics, an example of global navigation on intraoperatively acquired data may be to modify the robot path based on tracked 3D ultrasound images. In comparison to e.g. preoperatively available CT images, ultrasound provides current data with higher axial resolution. Especially when using techniques such as coded excitation and matched filtering, the depth (axial) resolution of the US data can reach 15 µm ([Federspil03b]). Sampled with a robot-held US probe, the lateral resolution can be twice that of conventional CT scans. This increased precision can be used to modify the planned path according to accumulated knowledge from the competitive sensor data fusion of CT and US. Usually, a path modification should only be performed to avoid critical regions that show up after sampling of the new intraoperative data (Figure 1).
Figure 1: Conservative path modification, lifting the miller over critical regions (I: implant, K: bone, D: dura/brain, R: extent of critical region)
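The conservative path modification illustrated in Figure 1 can be sketched as clamping the milling depth wherever the path crosses a critical region. This is an illustrative reconstruction of the idea, not the RONAF path planner; the region representation (circles in the xy-plane with a maximum safe depth) and all parameter names are assumptions:

```python
import numpy as np

def lift_over_critical_regions(path, regions, clearance=1.0):
    """Conservatively modify a milling path: lift the tool over critical regions.
    path: (N, 3) array of milling positions; column 2 is depth (larger = deeper).
    regions: list of (center_xy, radius, max_safe_depth) tuples.
    Returns a copy of the path whose depth never exceeds the safe depth
    (minus a clearance margin) inside any critical region."""
    out = path.copy()
    for center, radius, max_depth in regions:
        # Boolean mask of path points whose xy position lies inside the region
        inside = np.linalg.norm(out[:, :2] - center, axis=1) <= radius
        out[inside, 2] = np.minimum(out[inside, 2], max_depth - clearance)
    return out
```

Points outside every critical region are left untouched, which matches the "conservative" character of the modification: the path is only ever made shallower, never deeper or rerouted.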
3.4. Local Navigation
In contrast to the above navigation principles, local navigation does not require a map of the environment before the process starts. In fact, execution begins without prior knowledge. The robot is positioned in the execution area by the operator. A local map is then continually filled with information sampled during execution, realizing an iteratively enhanced environment representation. The added information has two important properties: it is necessarily local in nature, and it may provide more precise knowledge of the environment than global sensors could. Since the position of the sensors relative to the robot is known, this map and the robot are implicitly registered. Robot and environment are necessarily registered as well (provided that no registration loss occurs). The information is sampled in tight temporal relation to the process, so it can be assumed to be as up-to-date as possible. Furthermore, data can be acquired through local sensors that deliver more precise information. For our car driver, local navigation might mean e.g. scanning a street junction before crossing it. Compared with a complete map of the city, this information is highly local, but it also provides a more current and precise view of the situation (other cars, traffic jams, road works) than could be expected from a global map. In surgery, local navigation may e.g. mean building a histological map from sensor readings that allow tissue discrimination. These can be sensors for nerve proximity detection through electromyography (EMG) or impedance spectrometry. Tissue discrimination based on force/torque readings from the process allows classifying tissue as bone, dura, or air ([Stolka02]), similar to vibrotactile measurements for the diagnosis of certain histological changes like cancerous processes ([Plinkert98]).
All this information is spatially registered with the preoperative data (via the initial patient-to-data registration step) and can thus be used as a persistent information source for e.g. path modification procedures (see Section 3.3).
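A local histological map of the kind described above might be sketched as a 2.5D grid that accumulates tissue classifications per surface cell. The class, tissue labels, and method names below are hypothetical illustrations of the concept, not the RONAF data structures:

```python
import numpy as np

# Hypothetical tissue labels; RONAF's force/torque-based classes are bone,
# dura, and air ([Stolka02]).
UNKNOWN, AIR, BONE, DURA = 0, 1, 2, 3

class LocalMap25D:
    """2.5D local map: per (x, y) cell, the deepest explored depth and the
    tissue label sensed there. Implicitly registered with the robot, since
    samples arrive in robot coordinates."""
    def __init__(self, size_xy, resolution):
        n = int(size_xy / resolution)
        self.res = resolution
        self.depth = np.full((n, n), np.nan)              # deepest sample per cell
        self.label = np.full((n, n), UNKNOWN, dtype=np.uint8)

    def update(self, x, y, z, tissue):
        """Enter one intraoperative sample (tool position plus tissue class).
        Only a deeper sample overwrites a cell, so the map keeps the most
        advanced knowledge of the milled surface."""
        i, j = int(x / self.res), int(y / self.res)
        if np.isnan(self.depth[i, j]) or z > self.depth[i, j]:
            self.depth[i, j] = z
            self.label[i, j] = tissue

    def is_critical(self, x, y):
        """A cell becomes critical once dura has been sensed there."""
        i, j = int(x / self.res), int(y / self.res)
        return self.label[i, j] == DURA
```

Because the map persists across control cycles, a dura contact detected once can keep the corresponding cell blocked for all later passes, which is exactly what distinguishes local navigation from pure control.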
3.5. Control
Control encompasses the data cycle of measuring data elements from the process through a measurement module (data sampling), computing a reaction in a controller that is fed to an actuating element in the process, and possibly a data feedback path to the controller for closed-loop control. For effective control, tight temporal coupling between these steps is paramount. Pure control does not require
any kind of spatial information to work; it serves as a reactive navigation principle without any persistent mapping functionality. Without knowing his current or the target location, our car driver controls the trajectory of his vehicle on a winding road through small steering actions, counteracting curves, wind gusts, etc. In surgery, one example of process control is force-based control of milling speed ([Stolka01], [Federspil03c]). As excessive forces can harm the patient, they need to be monitored and controlled. In the RONAF project, the absolute force is measured and fed back to the robot speed controller, avoiding thermal injury (heat necroses) and emergency stops. Trajectory corrections under external force during milling (e.g. [Engel02]) are another important control goal in surgical robotics. Especially longer tools like millers or laparoscopic needles suffer from deformation during the process. For an autonomous intervention, this has to be modeled and counteracted. These controllers have to be fast, but may be ignorant of the current global position. Stretching this definition of control, the concept of Navigated Control integrates map-based navigation with control of a hand-held surgical tool [Hein02]. Demonstrated on a milling system, the tool is switched on and off according to its position relative to safety regions defined in 3D image data.
4. General Navigation System Architecture
By laying out the four navigation principles mentioned above, we will now describe a framework for sensor integration from a system-architectural point of view. All process phases relevant to a surgical robot-assisted intervention (preoperative data acquisition, intervention planning, intraoperative registration, sampling of intraoperative data, control, and actual process execution) are reflected in the general navigation system architecture in Figure 2 and by the described navigation principles. Therefore, this concept should be applicable to almost any kind of surgical robot system regardless of its nature, be it autonomous (as in the RONAF system), synergistic (e.g. the ACROBOT system), telemanipulated (e.g. the A73 system), or passive (which might even omit a robot component). Almost all surgical procedures are subdivided into two main phases, one for the preparation (preoperative phase, e.g. including implantation planning) and one for the execution of the surgical intervention (intraoperative phase). The tool path or motion space computed in the preoperative phase is to be adhered to during the intraoperative phase. Depending on the actual system used, all of the navigation principles (as described in Section 3) can be realized as four sensory feedback cycles.
Figure 2: General navigation system architecture, spanning the preparation and execution phases and including the four navigation principles (A through D)
We implemented this architecture in our demonstration system RONAF. One goal of this project is the planning and autonomous milling of implant beds for implantable hearing aids. It is based on an industrial robot (Stäubli RX130, serial six-DOF, 0.3 mm relative accuracy), a real-time controller (68040/40 MHz Adept CS7, V+ 12.3) and an external planning PC. It is equipped with a surgical miller (Aesculap Microspeed EC/GD657, 30,000 min⁻¹). Local sensors include a 6D force/torque sensor (JR3 KMS 90M31), ultrasound probes (f = 1...4 MHz) and an electromyography station (Viking IV); CT imaging and an IR camera serve as global sensors.
A. The embracing outer feedback cycle (path A in Figure 2) begins with the preoperative planning phase, i.e. the acquisition of global 3D slice images of the situs and the determination of the milling volume together with the computation of the milling path. The imaging modality can be any of CT, MRT, or 3D tracked ultrasound. The exact procedure for the determination of the milling volume depends on the intervention. For a mastoidectomy, it consists of mastoid bone segmentation. For an implant bed milling, rastered and layered bone representations are generated. The raster representation is used for path computation as described in [Waringo03b]. For a shorter planning phase, the exact position of the implant can be optimized automatically so that the implant does not break through the lower bone profile (see Section 3.2).
B. In the case of an unknown relative position of situs and robot, the global map and the situs have to be registered (path B). This is almost always the case
when co-registration is impossible, and can be achieved through global sensors - by manual pointing or with a conventional navigation system.
C. With local sensors, a map for local navigation can be built successively (path C). Local sensor information can be gathered through e.g. force/torque (F/T) sensor readings at the miller, electromyographic excitation, nerve impedance, or temperature readings in the milled area [Federspil03c]. With the F/T sensor, contact state information can be sampled. This information is entered into a 2.5D representation of the intervention region and can be used to avoid critical regions in the future ([Stolka02]).
D. Finally, the actual milling is speed-controlled (path D). Closed-loop control of the measured forces, modifying the robot speed, effectively constrains the maximum temperatures. We use proportional control with an absolute force target value of F_t = 15 N, constraining temperatures to 60°C ([Fuchsberger86], [IPA00]), which avoids bone heat necroses and leads to a more "natural" milling procedure.
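The proportional force-based speed control of path D can be sketched as follows. The target force of 15 N is the value quoted in the text; the gain, speed limits, and function name are invented for illustration and are not RONAF's actual settings:

```python
def force_based_feed_rate(f_measured, f_target=15.0, v_max=2.0, k_p=0.2):
    """Proportional force-based feed control for the milling robot (sketch).
    When the measured milling force exceeds the target, the feed speed is
    reduced (limiting heat generation); when it is below, speed increases.
    Returns a feed speed in mm/s, clamped to [0, v_max]."""
    error = f_target - f_measured          # positive error: force below target
    v = v_max * 0.5 + k_p * error          # simple P-law around the mid speed
    return max(0.0, min(v_max, v))         # never reverse, never exceed v_max
```

Clamping the output at zero reflects the safety requirement: under excessive force the robot slows to a halt rather than pushing harder, which is what avoids heat necroses and emergency stops in the closed loop.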
5. Conclusion
We have presented a general model for navigation in surgical robotics. We introduced the term local navigation to describe on-line data sampling during an intervention, allowing for more precise and/or more current information than global sensors can provide. Integrating the four navigation principles (global navigation based on both pre- and intraoperatively acquired images, local navigation, and control), we defined a conceptual framework accommodating sensors in a modular fashion. Several of the navigation cycles have already been closed in the RONAF project, among them the outer ones - global navigation with preoperative 3D images and registration via the robot - and the inner cycle for force-based speed control. In the future, we are going to close the remaining navigation cycles in our system, showing the efficacy of the proposed scheme. Furthermore, we will explore the possibilities of a limited set of sensors, since e.g. F/T readings are useful in several of the mentioned cycles.
Acknowledgements
This work is a result of the project "Robotergestützte Navigation zum Fräsen an der lateralen Schädelbasis (RONAF)" of the special research cluster "Medizinische Navigation und Robotik" (SPP 1124) funded by the Deutsche Forschungsgemeinschaft (DFG), performed in cooperation with the "Zentrum für Schädelbasis-Chirurgie der Universitätskliniken des Saarlandes" in Homburg/Germany. Further information can be found at http://ai3.inf.uni-bayreuth.de/projects/ronaf/.
References
[Engel02] Engel, D.; Raczkowsky, J.; Wörn, H. (2002). Sensor-Aided Milling with a Surgical Robot System. CARS 2002, Paris/France.
[Federspil03b] Federspil, Ph. A.; Tretbar, S. H.; Geisthoff, U.; Plinkert, B.; Plinkert, P. K. (2003). Ultrasound based navigation of robotic drilling at the lateral skull base. In: Lemke, H. U.; Inamura, K.; Doi, K.; Vannier, M. W.; Farman, A. G.; Reiber, J. H. C. (eds.): CARS 2003. Elsevier Science BV, p. 1358.
[Federspil03c] Federspil, Ph. A.; Stolka, Ph.; de Mola, C.; Geisthoff, U.; Henrich, D.; Plinkert, P. K. (2003). Kraftgeregelter Robotervorschub verhindert Hitzeschäden beim robotergestützten Fräsen an der lateralen Schädelbasis. CURAC 2003, Nürnberg/Germany.
[Fuchsberger86] Fuchsberger, A. (1986). Untersuchung der spanenden Bearbeitung von Knochen. iwb Forschungsberichte Band 2, TU München, Institut für Werkzeugmaschinen und Betriebswissenschaften.
[Hein02] Hein, A.; Kneissler, M.; Matzig, M.; Lüth, T. (2002). Navigated Control - Ein neuer Ansatz für das exakte Fräsen. CURAC 2002, Leipzig/Germany.
[Henrich02] Henrich, D.; Plinkert, P. K.; Federspil, Ph. A.; Plinkert, B. (2002). Kraft-basierte lokale Navigation zur robotergestützten Implantatbettanlage im Bereich der lateralen Schädelbasis. VDI-Bericht 1679 - Tagungshandbuch zur Robotik 2002, München/Germany.
[IPA00] IPA Fraunhofer Institut für Produktionstechnik und Automatisierung; U.R.S.; Universitätsklinikum Tübingen (2000). Evaluierung von Prozeßparametern für die roboterunterstützte Mastoidektomie.
[Plinkert98] Plinkert, P. K.; Baumann, I.; Flemming, E.; Loewenheim, H.; Buess, G. F. (1998). The use of a vibrotactile sensor as an artificial sense of touch for tissues of the head and neck. Minimally Invasive Therapy & Allied Technologies, Vol. 10, No. 6, 323-327.
[Stolka01] Stolka, Ph. (2001). Robotergestützte HNO-Chirurgie: Voruntersuchungen zum Sensoreinsatz. Project Thesis, Universität Kaiserslautern, AG Robotik und Eingebettete Systeme (RESY).
[Stolka02] Stolka, Ph. (2002). RONAF: Auswertung von Kraftsensordaten. Diploma Thesis, Universität Kaiserslautern, AG Robotik und Eingebettete Systeme (RESY).
[Stolka03] Stolka, Ph.; Henrich, D. (2003). A Hybrid Force-Following Controller for Multi-Scale Motions. SYROCO 2003, Wroclaw/Poland.
[Waringo03b] Waringo, M.; Stolka, Ph.; Henrich, D. (2003). 3-Dimensional Layered Path Planning for Anytime Milling Applications. CURAC 2003, Nürnberg/Germany.
[Waringo03c] Waringo, M.; Stolka, Ph.; Henrich, D. (2003). First System for Interactive Position Planning of Implant Components. CURAC 2003, Nürnberg/Germany.
ROBOTIC SURGERY IN NEUROSURGICAL FIELD
HIROSHI ISEKI¹, YOSHIHIRO MURAGAKI¹, RYOICHI NAKAMURA¹, MOTOHIRO HAYASHI¹, TOMOKATSU HORI¹ AND KINTOMO TAKAKURA¹
¹Institute of Advanced Biomedical Engineering and Science and Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo, 162-8666, Japan
SHIGERU OMORI¹,²
²Terumo Corporation R&D Center, 1500 Inokuchi, Nakai-machi, Ashigarakami-gun, Kanagawa, 259-0151, Japan
KOUJI NISHIZAWA¹,³
³Mechanical Engineering Research Laboratory, Hitachi Ltd., 502 Kandatsu, Tsuchiura, Ibaraki, 300-0013, Japan
ICHIRO SAKUMA⁴
⁴The University of Tokyo, 7-3-1 Bunkyo-ku, Tokyo 113-8654, Japan
Information technology, visualization, and manipulation will be the key words for next-generation neurosurgery. In order to let surgical manipulation evolve from the craftwork level to expert manipulation, it is necessary to establish a strategy desk based on a target-optimizing management system which makes precise preoperative surgical planning and lays out the surgical course based on digital images as a so-called road map, leading the manipulator to the given destination. We are trying to construct a robotic surgery system which is equipped with a 300-fold micro-endoscope, a gamma knife Model-C which irradiates solid-state micro-laser beams of 2.8 µm wavelength, and an open magnetic resonance imaging (MRI)-compatible micro-manipulator. The gamma knife can ablate a "pin-point" brain tumor (target) without disturbing the surrounding normal brain tissue because the penetrance of the laser beam is as short as 100 µm, compared to 500 µm for the previous gamma knife Model-B. In addition, the auto-positioning system of Model-C (C-APS) can precisely lead the laser beam to the target. The accuracy of the "pin-point attack" is of great advantage for patients and medical staff. The ultimate goal of robotic surgery (C-APS) is to achieve the total resection of residual brain tumor located in or adjacent to the functional region, aided by visualized anatomical, physiological, and functional information from pre- and intra-operative MRI.
* This work is supported in part by Health Science Research Grants (Research on Advanced Medical Technology: H13-tiryou-004 in 2001-2003) of the Ministry of Health, Labor and Welfare to H. Iseki. This study is also supported by the Industrial Technology Research Grant Program in 2003 (A45003a) from the New Energy and Industrial Technology Development Organization of Japan to Y. Muragaki.
1. Introduction
The history of surgery is comparable to the advances in tools, devices and systems for combat. Information technology, visualization, and manipulation will be the key words for next-generation surgical treatment. In addition, how to assure the safety and quality of medical treatment (especially surgery and the following treatment) is becoming an important trend. In order to achieve quality control of treatment, it is necessary to perform pin-point surgery exclusively on the target while the functional region around the target is preserved. In order to intraoperatively evaluate and support the ongoing surgery, it is necessary to establish a system of log management and navigation¹⁾ which makes it possible to know the site where the surgery is going on and the situation at that site, and a system that supports intraoperative decisions in real time. Like in the Iraqi war, the question is how to attack the designated target rationally under strategic navigation by mobilizing tactical forces on the target. In surgical treatment, a surgical plan is made preoperatively at headquarters. During surgery, intraoperative events are monitored in real time, and when some trouble occurs, the optimum solution is presented to the surgeon by interpreting the situation promptly. Further, the present situation is correctly evaluated, a new surgical plan is made, and the next surgery (precision-guided surgery) is performed to lead the surgery to the initially set-up goal.
2. Advances in Computer-aided and Robotic Surgery
From the late 1980s, computer-aided surgery evolved as a technology which supports surgery by using computers for diagnosis. Nowadays the clinical application of integrated diagnosis, surgical planning, and navigation in computer-aided surgery has become ubiquitous. At the same time, when intraoperative magnetic resonance imaging (iMRI) was introduced, "real-time" updated navigation on iMR images became popular as the surgeon's new eyes visualizing medical information. Consequently, the accuracy of navigation surgery markedly improved. In the 1990s, surgical manipulators such as da Vinci™, Zeus™ and Robodoc™ were introduced in the clinical field, and robotic surgery thus became a reality. In the neurosurgical field, we have developed a neurosurgical robot (hyper utility mechatronic assistant: HUMAN; Hitachi Ltd.). This neurosurgical robot was clinically applied at Shinshu University under the name of Neurobot by Professor Hongo, for the first time in the world, in August 2002²⁾. By January 2004 his surgical team had performed three cases of Neurobot surgery, including third ventriculostomy. In order to perform next-generation robotic surgery, we are working to establish a strategy desk as a surgeon's new brain and to bring it into practical use through the fusion of information technology and computer technology.
3. Target Controlled Management (TCM) System by Fusion of Diagnosis and Treatment
The time lag between diagnosis and treatment has become almost zero owing to advances in intraoperative diagnosis technology, and a new era has begun in which real-time treatment following the diagnosis is pursued. Along with the development of this fusion technology, it becomes necessary to establish a system which supports real-time decisions for treatment intraoperatively. Medical information necessary for the decision is updated in real time and given to the surgeon using the system. The outcome of the treatment is immediately fed back, and the optimum solution, which can adjust to ever-changing problems, is offered to the surgeon. In a short time, the course of treatment based on the quick diagnosis and its outcome is presented, and the diagnosis immediately affects the treatment. At the same time, the treatment is promptly evaluated. In implementing this system it is important to build up a TCM system which minimizes the gap between the outcome of planning and the actual situation by constructing circumstances that mimic the real world in virtual space, where ongoing events are input and analyzed in real time and the results are shown on a road map which facilitates real-time decisions and, consequently, the achievement of the initial treatment plan. The safety of surgery and the following treatment can be improved by precisely recording the log of surgery and the following treatment digitally, and by monitoring the surgical procedures and outcome in real time using this system.
4. Robotic Surgery and Intelligent Operating Theater
In order to perform safe and precise surgery it is necessary not only to use imaging which presents morphological information necessary for surgery, such as intraoperative imaging (MRI and computed tomography), but also to exploit other imaging which can show functional and metabolic information. The iMRI has a role in assisting the surgeon's decision on the next surgical procedures by showing the present status in real time. In order to compensate for the deformation and shift of the organ due to surgical procedures, preoperative images are not sufficient, and it is necessary to update the navigation information using intraoperatively acquired images. Real-time updated navigation and chemical navigation visualize residual tumor, so that total resection can be achieved. Further, with the concomitant use of diffusion-weighted imaging (DWI) navigation, which visualizes motor nerve tracts such as the pyramidal tract, it is possible to preserve the pyramidal tract during surgery. In the case of craniotomy, volumetric MRI data, which is not particularly useful for diagnosis, is of great use. Using the volumetric data, bird's-eye images and three-dimensional, cross-sectional views along the surgical
trajectory can be shown at the request of the medical staff. Such surgical support using intraoperative images is essential for safe and accurate surgery. The console of the MRI device in our intelligent operating theater (AIRIS II, Hitachi Medical Corporation) is equipped with systems for the collection, log management, and delivery of intraoperative medical information, and with a tele-support system (tele-monitoring, tele-instruction, and tele-conferencing). The main functions of these systems include two-way communication with the surgical strategy desk (including the tele-conference system), streaming of surgical information to personal digital assistants in the operating theater, surgery support such as navigation, and medical traceability. In the future, information specific to an individual staff member will be delivered exclusively to that member on request, and as robotic surgery systems advance, further functions will be installed one after another. In other words, this is a treatment strategy system in which "visualized" information is shared with the team and medical treatment is carried out by the team. In this system, prevention, diagnosis, and treatment are brought together under the control of the strategy desk, which is linked to the integrated information management desk. In our open-MRI operating theater, i.e., the intelligent operating theater, 203 surgeries were performed from March 2000 to February 2004. Meanwhile, we continue to develop surgical support devices that should ensure safe and accurate surgery, and to improve surgical planning, surgical procedures, and evaluation criteria based on precise diagnostic imaging.
5. Computer-aided Design and Computer-aided Manipulation (CAD/CAM) Laser System for Precision-guided Surgery

We have experimentally manufactured a device that enables contactless laser ablation of residual tumor seated at the bottom or lateral wall of the tumor cavity by combining iMRI with "real-time" or chemical navigation. The area to be treated is determined while observing the target with a charge-coupled device camera. The lesion is then ablated by highly efficient evaporation etching using a newly developed mid-infrared, continuous-wave laser and its local thermal action. To ablate residual tumor tissue, positioning is achieved with better than 0.5 mm accuracy and the evaporation etching is controlled to within 0.1 mm to 0.2 mm in depth and width. A wavelength of 2.8 µm was chosen because it exhibits the best absorption by brain tissue. The micro-laser, in combination with the surgical strategy system and intraoperative imaging, is the method of choice for a "pin-point attack" because its penetration depth is as short as 100 µm or less. The ultimate goal is to achieve total resection of residual brain tumor located in or adjacent to the functional
brain region using the precision-guided operating system that employs the micro-laser and micro-endoscope aided by visualized anatomical and physiological information (Fig. 1).
Figure 1. A micro-laser ablation system for precision-guided surgery
6. MR Compatible Micro-manipulator
To let surgical manipulation evolve from the level of craft work to expert manipulation, it is essential to establish a strategy desk based on a target-optimizing management system that performs precise preoperative surgical planning, lays out the surgical course on digital images as a road map, and leads the manipulator to the given destination. Once this system is complete, micro-endoscopic surgeries will increase in number; third ventriculostomy for hydrocephalus, biopsy, and resection of small tumors are good indications. This is a surgical system that combines the low invasiveness of neuroendoscopic surgery with the operability and reliability of a manipulator. However, before a new surgical device and system is used for treatment, the surgical process for the indicated disease must be analyzed. This is an important step in verifying whether the new procedure and device, taken as a whole, have advantages over conventional ones. When the surgery is regarded as a production line, partial optimization of the procedure and surgical devices will merely shift bottlenecks to other parts of the system. Therefore the
optimization of the procedure and device as a whole is absolutely necessary. Based on three-dimensional image information of the lesion area of interest, it is necessary to find the optimal access route, to secure the minimum space required for surgery, and to examine the shape and function of the devices essential to accomplish it. At the same time, a simulation of the surgical procedure, aided by three-dimensional positioning information that shows the planned surgical process, must be carried out and analyzed along the surgical scenario using three-dimensional CAD. During the simulation the surgeon must uncover possible problems or bottlenecks specific to the given surgical field, in particular how to secure the best access for surgical devices under physical restrictions, how to preserve functional regions while approaching the target, and how to carry out the surgery and manipulate the target. The sequence of simulated actions is visualized as a time-series road map, and at the actual surgical site the surgery is completed safely and precisely by keeping the gap between ongoing events and the predetermined road map to a minimum.
7. Gamma Knife Model C Auto-positioning System (C-APS)
The Gamma Knife is an instrument that radically cures brain lesions with gamma rays, as if a conventional surgical knife were cutting out the lesion, without disturbing the surrounding normal brain tissue. With the old Model B the attainable accuracy was 0.5 mm, because placing the helmet on the patient's head and positioning each target were carried out manually. In the Model C with Automatic Positioning System (Elekta K.K.), small motor-driven devices installed on both the left and right inner sides of the helmet make it possible to adjust the position for every target automatically with 0.1 mm accuracy once the helmet is worn. This system therefore provides a less invasive and safer treatment environment for patients by reducing temporal and physical strain irrespective of the number of shots, and this robotic surgery system (C-APS) considerably reduces the strain on the medical staff as well (Fig. 2). By February 2004 we had performed 3000 Gamma Knife surgeries, including 500 robotic surgeries using the C-APS. Owing to the C-APS, treatment accuracy improved to the 0.1 mm range and treatment duration was shortened. Longer irradiation times per shot, however, posed a substantial problem for patient management; we addressed it by introducing tele-anesthesia and tele-monitoring, making the robotic surgery system safer and more accurate.
Figure 2. Computer-aided planning
8. Discussion

When total resection of a malignant glioma located in or adjacent to an eloquent area is achieved safely and precisely, recurrence should be prevented and the five-year survival rate should improve. However, reports differ with respect to the effect of resection rate on prognosis. According to Lacroix et al.5, the five-year survival rate of primary glioblastoma multiforme was good when the resection rate was 98% or more, and when recurrent cases were included, survival was significantly better in the group with 87% or more tumor resection than in the group with less than 87%. Similarly, the Brain Tumor Registry of Japan revealed that the five-year survival rate increased from 21% after 95% resection to 41% after 100% ablation6. We have achieved an average resection rate of over 91% using the intraoperative open MRI and the navigation system. However, we cannot yet draw any conclusion about the relationship between resection rate and prognosis, because the average follow-up in our cases is only two years and three months. Hess7 has documented the effects of resection rate on prognosis in detail. To accomplish a 100% resection rate it is necessary to resect brain tumor with 0.5 mm accuracy. The accuracy of conventional surgical procedures, however, is limited because the approach of surgical devices to the tumor lesion is controlled manually. The robotic surgery system will
overcome this problem. It will be of great use because it can perform surgery safely with 10 µm precision and can record a log of the surgical process. In particular, the fact that the surgical process is recorded over time plays an important role in the postoperative evaluation of the procedure, in validating the surgical plan, and in analyzing accuracy. Although the operability of a manipulator with 10 µm precision ensures safe and accurate surgery, the robotic system cannot be used for the ablation of large tumors. The future of robotic surgery depends on how the advantages of the robot are exploited. The essence of robotic surgery is accurate treatment based on a clear distinction between the tumor lesion and normal tissue, and the advantages of robotic surgery converge in the following three points: (1) it can carry out surgical processes that cannot be done with conventional surgical devices, (2) it can handle complicated surgical processes in a limited space, and (3) it can execute a surgical process precisely according to a computer-made design.
Acknowledgement
We acknowledge the intensive support of the staff of Hitachi Medical Corporation with respect to the construction, administration, and maintenance of our intelligent operating theater.
References
1. K. Hongo, S. Kobayashi, Y. Kakizawa, J. Koyama, T. Goto, H. Okudera, K. Kan, M. Fujie, H. Iseki, K. Takakura, Neurosurgery 51, 985 (2002).
2. H. Iseki, Y. Muragaki, K. Nakamura, M. Hayashi, T. Hori, K. Takakura, Proc. Comput. Graph. Internatl. 2003, 44 (2003).
3. N. Miyata, E. Kobayashi, D. Kim, K. Masamune, I. Sakuma, N. Yahagi, T. Tsuji, H. Inada, T. Dohi, H. Iseki and K. Takakura, in T. Dohi and R. Kikinis (Eds.), MICCAI 2002, LNCS 2488, pp. 107 (2002), Springer-Verlag Berlin Heidelberg.
4. Y. Koseki, T. Washio, K. Chinzei, H. Iseki, in T. Dohi and R. Kikinis (Eds.), MICCAI 2002, LNCS 2488, pp. 114 (2002), Springer-Verlag Berlin Heidelberg.
5. M. Lacroix, D. Abi-Said, D. R. Fourney, Z. L. Gokaslan, W. Shi, F. DeMonte, F. F. Lang, I. E. McCutcheon, S. J. Hassenbusch, E. Holland, K. Hess, C. Michael, D. Miller and R. Sawaya, J. Neurosurg. 95, 190 (2001).
6. The Committee of Brain Tumor Registry of Japan, Report of Brain Tumor Registry of Japan (1969-1993), 10th edition, Neurologia medico-chirurgica Suppl. 40, 54 (2000).
7. K. R. Hess, J. Neurooncol. 42, 227 (1999).
FROM THE LABORATORY TO THE OPERATING ROOM: USABILITY TESTING OF LER, THE LIGHT ENDOSCOPE ROBOT
P. BERKELMAN, E. BOIDARD, P. CINQUIN AND J. TROCCAZ
TIMC-IMAG Laboratory, Institut de l'Ingénierie de l'Information de Santé, Faculté de Médecine, 38706 La Tronche, France
E-mail: [email protected]
The LER is a compact endoscope manipulator for minimally invasive surgery which eliminates the need for a human assistant to hold the endoscope and provides hands-free user interfaces to control orientation and insertion depth. Ongoing testing during procedures on cadavers and animals has enabled us to improve the design of the LER to address concerns related to ease of use, versatility, cleaning, and resistance to sterilization procedures. These refinements are made in view of planned human clinical trials of the LER.
1. Introduction
We have developed a surgical assistant robot to hold and manipulate an endoscope during minimally invasive surgery. The novel features of this endoscope robot are its simplicity and small size, which confer significant advantages in ease of use, safety, and versatility compared to current endoscope manipulators, which hold the endoscope with a robotic arm extended over the patient and have a large, heavy base resting on the floor or mounted to the operating table. The LER is being tested and evaluated regularly by assisting surgical interns performing training procedures on cadavers and animals. The results of this testing have enabled us to progressively refine the design and operation of the LER with minor modifications to further improve aspects of the device such as its versatility, stability, safety and the ease of setup, operation, maintenance, and cleaning.
2. LER
The LER consists of an annular base placed on the abdomen, a clamp to hold the endoscope trocar, and two joints which enable azimuth rotation and inclination of the endoscope about a pivot point at the incision. A compression spring around the endoscope shaft and a cable wrapped around a spool control the insertion depth of the endoscope. Small brushless motors actuate each motion degree of freedom. Control of the robot is simple and straightforward, as the motion of each motor directly corresponds to horizontal and vertical motion and the zoom of the endoscope camera image. No kinematic calculation, initialization procedure, or homing sequence is necessary for operation of the robot. The latest prototype of our endoscope robot is sterilizable, so that plastic draping for sterility is unnecessary and it may be autoclaved with other surgical equipment. A schematic of the most recent prototype of the LER is shown in Fig. 1 and further details of its design and operation are described in Ref. 3. Earlier prototypes are described in Refs. 4 and 2.
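Because each motor maps one-to-one onto an image-space motion, the control layer needs no kinematic model at all. The sketch below illustrates such a direct mapping; the command names and mapping table are hypothetical illustrations, not the LER's actual software:

```python
# Hypothetical sketch of a kinematics-free control mapping like the LER's:
# each user command drives exactly one motor, so no inverse kinematics,
# initialization, or homing sequence is needed. All names are illustrative.

COMMAND_TO_MOTOR = {
    "left":     ("azimuth",     -1),  # horizontal camera-image motion
    "right":    ("azimuth",     +1),
    "up":       ("inclination", +1),  # vertical camera-image motion
    "down":     ("inclination", -1),
    "zoom_in":  ("insertion",   +1),  # insertion depth acts as zoom
    "zoom_out": ("insertion",   -1),
}

def motor_command(command, speed=1.0):
    """Translate a voice/pedal command into a (motor, velocity) pair."""
    motor, sign = COMMAND_TO_MOTOR[command]
    return motor, sign * speed

print(motor_command("zoom_in", 0.5))  # ('insertion', 0.5)
```

This one-command-one-motor structure is what makes a voice interface practical: each recognized word resolves to a single motor velocity, with no state to initialize.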
3. Hygiene Issues

Any medical device which is brought into the proximity of the site of a procedure performed on the body of a patient must be completely sterile to avoid any risk of infection. Sterility of these devices can be achieved by sterilization through elevated heat, humidity, and pressure in an autoclave, by using disposable devices in sealed packaging, or by carefully covering exposed non-sterile parts of the device with disposable sterile plastic sheeting. A combination of these methods can also be employed, for example in motorized devices where the actuators are draped and the end effectors are disposable or sterilizable. The LER has been designed to be sterilizable by autoclave so as to eliminate the need for sterile drapes. This simplifies and shortens the setup of the LER, and eliminates both the risk of contamination due to torn or misplaced drapes and the cost of custom-formed disposable drapes. The motors, reducers, and connectors in the LER are certified by the suppliers to be autoclavable, all wiring connections and cabling are sealed in silicone materials, and the mechanism parts are fabricated from stainless steel and hard-anodized aluminum, which resist corrosion and surface degradation due to repeated autoclave cycles. Due to increasing concerns regarding prion diseases such as Creutzfeldt-Jakob disease (CJD), current sterilization protocols require that surgical
Figure 1. Light Endoscope Robot schematic
instruments must be immersed in a caustic solution for 20-30 minutes in addition to the autoclave cycle. This requires the LER motors to be watertight as well as autoclavable, so that the internal wiring and electronics of the motors are not damaged by leakage of the caustic solution.

4. Testing
Clinical trials of other endoscope robots such as the AESOP (Computer Motion/Intuitive Surgical Inc., Mountain View CA, USA) and the EndoAssist (Armstrong Healthcare Ltd., High Wycombe, England) have indicated that the safety and effectiveness of various surgical
Figure 2. Light Endoscope Robot in use on a cadaver
procedures are not adversely affected by using these endoscope robots in place of a human endoscope-holder assistant, and that the stability of the endoscope camera image is improved1,5,6,7. In order to prepare for clinical trials on patients and to improve its integration and ease of use in an operating-room environment, the LER has been used on a regular basis by surgeons during minimally invasive surgical training procedures on cadavers and animals. The LER is shown in use on a cadaver in Fig. 2. Two aspects in particular have been evaluated during testing: the means of fixation of the LER on the patient and the user command interface. To fix the location of the robot on the abdomen, a small articulated arm clamped to the table was found preferable to attachment by flexible straps or adhesive sheets, although suturing the robot to the abdomen was also adequate in certain cases. Most users preferred a voice-recognition command interface to buttons or pedals, even if the response was slightly delayed. A miniature keypad attached to a surgical instrument performed well, but adds the inconvenience of having to change the keypad attachment whenever the instrument is changed. Further testing was done using an electrocautery instrument. The tip of
the activated instrument was repeatedly brought into contact with the LER while on the abdomen of a cadaver, to test whether similar errors by a surgeon would result in any damage to the LER or injury to the patient. The electrical arcs produced almost imperceptible cosmetic burn spots on the surface of the LER but did not damage the motors or control electronics. Slight burns were produced on the skin of the cadaver whenever the electrical contact between the base of the LER and the abdomen was poor, such as when the only contact was along the edge of the base ring.
5. Modifications
The motion speed of the LER in different directions was adjusted according to the preferences of the surgeons who have used it. Based on their feedback, a hook attachment was added to the insertion cable to simplify removal of the endoscope for cleaning the lens, and a command was added to the voice-recognition interface to switch off the motors and enable manual repositioning of the LER. It was found that the clamp at the end of the articulated arm holding the LER in position could potentially obstruct motions of the surgeon, the instruments, or the endoscope in certain configurations. To prevent any obstruction, a 20 cm panhandle-shaped bar was fabricated to serve as an adapter between the LER and the articulated arm, increasing the distance between the clamp and the LER. Current plans for further modifications include adding watertight seals for the motor shafts and simplifying disassembly to improve the convenience of cleaning and sterilization of the LER. A direct path to electrical ground will also be provided from the metallic parts of the LER through the articulated arm, to prevent any possibility of damage to the motors or burns to the patient in case of contact by electrocautery instruments. The possibility of bodily fluids penetrating the ball bearing and gear teeth inside the base of the LER remains an important concern, as these areas are difficult to access with a brush for cleaning even when the base is disassembled. An immediate solution is to enclose the base of the LER in a waterproof bag with a hole for the trocar, to prevent penetration of fluids during procedures. Another potential solution would be to remove the outer wall and a portion of the upper walls enclosing the gear teeth and ball bearing inside the base of the LER, thereby permitting sufficient access to the interior with a fine brush for cleaning.
6. Conclusion
We have established a new approach for small and simple surgical assistant robots in minimally invasive surgery. We are continuing to refine the design of the LER through testing in procedures on cadavers and animals and in consultation with surgeons and nursing staff. The most recent modifications were proposed to address concerns for easier cleaning and improved resistance to current sterilization procedures.
Acknowledgments This work is sponsored by the RNTL and ANVAR through the MMM project. Support has also been provided by PRAXIM S.A. and CNRS. The robot mechanism was fabricated by Alpes Instruments S.A. of Meylan, France. The cooperation of the digestive and urological surgical staffs at the University Medical Center in Grenoble has been an essential contribution to this project.
References
1. S. Aiono, J. M. Gilbert, B. Soin, P. A. Finlay, and A. Gordon. Controlled trial of the introduction of a robotic camera assistant (EndoAssist) for laparoscopic cholecystectomy. Surgical Endoscopy, 16(9):1267-1270, September 2002.
2. P. Berkelman, P. Cinquin, E. Boidard, J. Troccaz, C. Létoublon, and J.-M. Ayoubi. Design, control, and testing of a novel compact laparoscopic endoscope manipulator. Proceedings of the Institution of Mechanical Engineers Part I: Journal of Systems and Control Engineering, 217(4):329-341, 2003.
3. P. J. Berkelman, E. Boidard, P. Cinquin, and J. Troccaz. LER: The light endoscope robot. In International Conference on Intelligent Robots and Systems, pages 2835-2840, Las Vegas, October 2003. IEEE/RSJ.
4. P. J. Berkelman, P. Cinquin, J. Troccaz, J. Ayoubi, C. Létoublon, and F. Bouchard. A compact, compliant laparoscopic endoscope manipulator. In International Conference on Robotics and Automation, pages 1870-1875, Washington D.C., May 2002. IEEE.
5. K. T. den Boer, M. Bruijn, J. E. Jaspers, L. P. S. Stassen, W. F. M. van Erp, A. Jansen, P. M. N. Y. H. Go, J. Dankelman, and D. J. Gouma. Time-action analysis of instrument positioners in laparoscopic cholecystectomy. Surgical Endoscopy, 16:142-147, 2002.
6. L. R. Kavoussi, R. G. Moore, J. B. Adams, and A. W. Partin. Comparison of robotic versus human laparoscopic camera control. Journal of Urology, 154:2134-2136, 1995.
7. L. Mettler, M. Ibrahim, and W. Jonat. One year of experience working with the aid of a robotic assistant (the voice-controlled optic holder AESOP) in gynaecological endoscopic surgery. Human Reproduction, 13:2748-2750, 1998.
ROBOTIC AND LASER AIDED NAVIGATION FOR DENTAL IMPLANTS

T. M. BUZUG*, U. HARTMANN, D. HOLZ, G. SCHMITZ AND J. BONGARTZ
Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany

P. HERING AND M. IVANENKO
Holography and Laser Technology, Stiftung caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany

G. WAHL AND Y. POHL
Poliklinik für Chirurgische Zahn-, Mund- und Kieferheilkunde, University Dental Clinic Bonn, Germany
The paper presents the project goals and first results of the network project ROLANDI (Robotic and Laser-Aided Navigation for Dental Implants). The combination of image guidance and laser surgery is a promising approach in dental implantology. The main advantage compared to conventional drilling is the contactless ablation process that diminishes residual movements of the patient. However, the accuracy of the entire registration chain - from the CT imaging via optical navigation to the positioning precision of the robotic laser tool holder - has to be investigated in the course of the project. We will present the methodology for the error propagation estimation and a novel laser-based procedure to obtain a ground truth.
1. Introduction

This paper introduces a research network consisting of RheinAhrCampus Remagen, Caesar Bonn and University Dental Clinic Bonn that is currently being established in the field of robot-assisted surgical laser interventions. The main project focus is an accuracy study for laser surgery in dental implantology. We will outline the main ideas and some preliminary results of the project "Robotic and Laser Aided Navigation for Dental Implants" (ROLANDI). The project is embedded in the Center of Expertise for Medical Imaging, Computing and Robotics (CeMicro†), a research center for medical image acquisition
* Contact: [email protected], tel. +49 (0) 26 42 / 932-318, fax +49 (0) 26 42 / 932-301.
† CeMicro is partially supported by the Ministry for Economy, Transport, Agriculture and Viniculture of Rhineland-Palatinate, Germany.
technology, medical signal processing and computation, and the improvement of clinical procedures in image-guided interventions with robotic assistance. The application of computer-aided planning, navigation and robotics in dental surgery provides significant advantages compared to conventional practice, owing to today's sophisticated techniques of patient-data visualization in combination with the flexibility and precision of novel robots. However, realizing navigation and robot assistance in a field where only local anesthesia is applied is a challenging task because of unavoidable patient movements during the intervention. In this paper we propose the combination of image-guided navigation, robotic assistance and laser surgery to improve the conventional surgical procedure. A key issue for the success of the surgical intervention is the registration of the planned operation trajectory with the real trajectory in the OR; in the course of the project this will be thoroughly investigated. As a preliminary study we have focused on the registration error in a point-based approach. Point-based registration is used in a large number of systems and is therefore chosen as the standard technique in our project. Other matching strategies, such as surface-surface or palpation-based registration, will be evaluated with respect to the results of the point-based registration. To evaluate the overall accuracy of the entire registration processing chain we have to estimate the localization error in the underlying image modality, the localization error of the fiducial or anatomical markers in the OR, and the positioning error of the surgical tool. In our case this means a calculation of error propagation from the CT via the optical navigator to the robotic laser holder. We start the investigation with an anatomical phantom equipped with fiducial landmarks.
To obtain a ground truth a holographic image of the phantom is taken that enables marker localization with a very high precision. A second work package of the project deals with laser intervention. Laser surgery is a smart technique, being advantageous whenever holes in small ridgelike bone structures are needed. However, our main goal is to reduce the efforts for patient fixation. Laser “drilling” is a contactless process, and therefore, in comparison with the conventional drilling, the jaw is not subjected to forces. As a consequence, the online tracking and re-registration of the patient has to cope with small residual movements only, even in the case of non-invasive patient fixation. On the other hand, a major drawback in surgical laser drilling is the loss of tactile feedback, which in fact is the unavoidable consequence when no forces are applied. Thus, no direct information about the drilling depth is available. This is a critical issue, because vital neighbouring structures as canalis mandibulae with the alveolar nerve must not be damaged during the intervention.
2. Equipment and Experimental Setup for Image-Guided Robotics
Besides a wide range of image acquisition devices, CeMicro at RheinAhrCampus Remagen is equipped with robotic tools for active handling in the OR. Robots can improve health care in several fields of medicine: modern robots operate accurately, fast and with no lapse of concentration. The OR Technology and Medical Robotics Laboratory is equipped with two Mitsubishi RV-E4NM robots and a Reis RV6 robotic system. The Mitsubishi RV-E4NM robots have a maximal payload of 4 kg and a repeatability of ±0.03 mm; they weigh only 50 kg and are simple to operate. The Reis RV6 is larger, with a maximal payload of 6 kg and a repeatability of ±0.05 mm. The CeMicro robot laboratory is set up in a complementary project [1]. An optical tracking system serves as the 'eyes' of the robots and of any other surgical instruments: it is capable of tracking the locations of the patient and the medical instruments. Our laboratory is equipped with NDI Polaris hybrid optical tracking systems as well as a Philips OPMS tracking system. Experimental physical models of real patients are scanned several times in Philips CT Secura, Siemens Somatom AR.T and General Electric Prospeed Advance computed tomographs. A computer model is obtained as a stack of 2D medical images; during segmentation the desired details are extracted. As a next step, the 3D surface model is reconstructed as a triangular mesh.
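The slice-stack segmentation step just described can be illustrated with a minimal intensity-threshold sketch on synthetic data. This is an illustration only, not the CeMicro pipeline; the intensities and the threshold value are made up:

```python
import numpy as np

# A CT scan arrives as a stack of 2D slices (here 20 slices of 64x64 pixels).
# Thresholding extracts the high-intensity structure (e.g. bone) as a binary
# mask, which a surface-reconstruction step would then turn into a mesh.
rng = np.random.default_rng(0)
stack = rng.normal(loc=100.0, scale=10.0, size=(20, 64, 64))  # soft tissue
stack[5:15, 20:40, 20:40] = 1000.0                            # "bone" block

bone_mask = stack > 400.0        # illustrative threshold, not calibrated HU
voxel_count = int(bone_mask.sum())
print(voxel_count)               # 10 * 20 * 20 = 4000 voxels
```

In practice the binary mask would be handed to a marching-cubes-style algorithm to produce the triangular surface mesh mentioned above.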
3. Error Sources within Registration
A critical point for the success of the surgical intervention is the registration of the planned operation trajectory with the real trajectory in the OR; in the course of the project this will be the main focus. As a preliminary study we investigate the registration error in a point-based registration, which is used as the standard in our project. Different matching strategies, such as surface-surface or palpation-based registration, will be evaluated against the point-based case.
3.1. Mathematical Basics

In first experiments we used fiducial markers as well as anatomical landmarks that form homologous point sets {p1, p2, p3, ..., pN} in CT space and {q1, q2, q3, ..., qM} in OR space. The rigid body transformation

q = Ap − b    (1)
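As a quick numerical illustration of this transformation, with made-up values for the rotation A and translation b:

```python
import numpy as np

# Apply a rigid body transformation q = Ap - b (sign convention as in the
# text) to one CT-space point. A and b here are arbitrary example values.
theta = np.pi / 2
A = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])  # 90 degrees about z
b = np.array([1.0, 0.0, 0.0])

p = np.array([1.0, 0.0, 0.0])  # a fiducial position in CT space
q = A @ p - b                  # the same point in OR coordinates
print(np.round(q, 6))          # [-1.  1.  0.]
```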
where A ∈ SO(3) is a rotation matrix and b a translation, transforms CT-image points to the OR coordinate system. Both sets of points are acquired with certain inaccuracies, and it is easy to see where they come from. Fig. 1a shows a skull phantom in the CT laboratory at RheinAhrCampus Remagen inside the GE Prospeed Advance tomograph. Six fiducial markers (4 on the top and 2 on the back) are mounted onto the skull. To acquire CT data for the error study, many scans with different protocols, phantom orientations and gantry tilts have been performed. In Fig. 1b the result of a pitch-one spiral scan with 3 mm slice thickness is shown. The variability of the positions determined in a segmentation step (Fig. 1c) is mainly induced by partial volume averaging in the anisotropic CT data sets and can be expressed in terms of the covariance matrix

Σ_p = (1/(N−1)) ∑_{i=1}^{N} (p_i − E(p)) (p_i − E(p))^T    (2)

where E(p) is the mean value vector

E(p) = (1/N) ∑_{i=1}^{N} p_i    (3)

for a certain marker at position p that is measured N times.
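The mean value vector and covariance matrix of repeated localizations of a single marker can be computed as follows; the measurements here are synthetic, standing in for the N segmented CT positions:

```python
import numpy as np

# N noisy localizations of one fiducial marker (synthetic stand-in data).
rng = np.random.default_rng(1)
true_pos = np.array([10.0, 20.0, 30.0])
measurements = true_pos + 0.2 * rng.standard_normal((500, 3))  # N = 500

mean_p = measurements.mean(axis=0)                       # mean value vector
centered = measurements - mean_p
cov_p = centered.T @ centered / (len(measurements) - 1)  # sample covariance

print(cov_p.shape)  # (3, 3)
```

With anisotropic voxel spacing the covariance becomes elongated along the slice direction, which is exactly the error-ellipsoid picture used below.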
Figure 1. (a) CT scanner at RheinAhrCampus Remagen. A skull phantom equipped with fiducial markers is scanned with different protocols (different slice thickness in conventional mode and different pitches in spiral mode) as well as different phantom orientations in combination with different gantry tilts. One result for a pitch-one spiral scan with 3 mm slice thickness is given as a 3D rendering in (b). 6 segmented fiducial markers can be seen in the semi-transparent visualization of the skull (c).
Unfortunately, the localization error in the CT image is not the only source of inaccuracy in the registration process. The second error source is the localization error of the fiducial markers in the OR. We know from the technical booklet of the NDI Polaris navigation system that there is a system inherent inaccuracy of 0.35 mm. However, this does not take into account the errors that
are introduced by the surgical interaction. Therefore, in a first step we determine the repetition accuracy of fiducial marker localization in the OR. Fig. 2 shows how the interaction inaccuracy is measured in the OR: the tip of the pointer device is positioned on a certain fiducial marker and the position is monitored at 60 Hz while the pointer is moved around on a sphere. As in the CT image, the variability of the positions of the fiducial markers in the OR can be expressed in terms of the covariance matrix

Σ_q = (1/(L−1)) ∑_{i=1}^{L} (q_i − E(q)) (q_i − E(q))^T    (4)

where E(q) is the mean value vector

E(q) = (1/L) ∑_{i=1}^{L} q_i    (5)
for a certain marker at position q that is measured L times. Fig. 2 shows that the covariance matrices can be geometrically interpreted as error ellipsoids. Since we are faced with errors in both coordinate systems, we have to estimate the transformation between the systems by minimizing

\chi^2 = \sum_{i=1}^{M}\left(q_i - A p_i - b\right)^T \left(\Sigma_{q,i} + A\,\Sigma_{p,i}\,A^T\right)^{-1} \left(q_i - A p_i - b\right)   (6)

where M is the number of markers.
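The weighted minimization has no closed-form solution in general, but in the unweighted special case (all covariance matrices equal and isotropic) it reduces to the classical SVD-based least-squares fit of Arun and Horn. A minimal sketch with synthetic marker sets (p plays the role of the CT points, q of the OR points):

```python
import numpy as np

def rigid_register(p, q):
    """Least-squares rigid transform (A, b) with q_i ~ A p_i + b;
    SVD solution (Arun/Horn) for the unweighted case."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)       # cross-covariance of the two sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    A = Vt.T @ D @ U.T              # proper rotation, det(A) = +1
    b = qc - A @ pc
    return A, b

# Synthetic check: six markers, rotated by 30 degrees about z and
# translated, must be recovered exactly in the noise-free case.
rng = np.random.default_rng(1)
p = rng.uniform(-50.0, 50.0, size=(6, 3))
c, s = np.cos(np.deg2rad(30)), np.sin(np.deg2rad(30))
A_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
b_true = np.array([10.0, -5.0, 2.0])
q = p @ A_true.T + b_true
A, b = rigid_register(p, q)
```

With anisotropic covariances as in eq. (6), this closed-form result is typically used as the starting point of an iterative re-weighted fit.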
Figure 2. To measure covariance matrices for the fiducial markers (which can be visualized as ellipsoids indicating confidence intervals), the tip of the pointer device points onto a certain marker. The position data are acquired with the navigation system while the pointer device is moved on a sphere.
The mathematics of the matching procedure is the same when anatomical landmarks are used instead of fiducial markers. This is especially important for dental application scenarios, because the fiducial markers are glued onto the skin and not onto the skull and are therefore subject to small skin shifts when the patient moves, e.g. opens his mouth. The only rigid structures available for anatomical landmark-based registration are the teeth. The teeth can be seen in the CT and can be identified and touched with the pointer device in the OR. Unfortunately, in most cases a new problem arises due to typical metal artefacts in the CT [2].

3.2. Ground Truth

To obtain the ground truth, a hologram of the phantom is taken with a holographic camera as shown in fig. 3. The GEOLA GP-2J camera system comprises a flash-lamp-pumped Nd:YLF laser oscillator with subsequent amplification and second harmonic generation.
Figure 3. Holographic camera system for recording a hologram of the phantom (left). Reconstruction setup to localize the fiducial markers in the holographic real image (right).
The resulting laser pulse has a wavelength of 526 nm, a duration of 25 ns and a maximum pulse energy of 2 J. Careful mode selection leads to a coherence length of above 6 m, allowing recordings within a large volume of several cubic meters. A typical object distance is approximately 60 cm. Illuminating the resulting hologram with the complex-conjugated reference beam of the recording setup, by means of an equivalent continuous-wave laser system, reconstructs the so-called real image of the hologram, which corresponds to a three-dimensional image of the phantom at identical scale. The corresponding reconstruction setup is sketched on the right-hand side of fig. 3. The resolution of the real image depends on the aperture and therefore on the size of the hologram. Using holographic plates of 40 cm x 30 cm, a resolution below 5 microns is achievable [6]. The real image is investigated with a CCD array mounted on a three-axis linear stage, which can be moved contactless and with all degrees of freedom, since the real image is just a three-dimensional projection of the phantom. The CCD array (Kodak KAI-4000) has 2048 x 2048 pixels of size 7.4 µm x 7.4 µm corresponding to an active area of
15.2 mm by 15.2 mm. The high resolution of the CCD array allows the identification of microscopic features in the real image, providing a localization of suitable fiducial markers of the phantom with high precision. We aim for an overall accuracy of marker localization below 10 microns. A hologram is capable of recording all parts of the object that are directed towards the holographic plate. To capture markers on the back of the phantom, additional holograms from different perspectives are necessary to determine the relative positions between all markers.

4. Laser Intervention
A second work package of the project deals with laser surgery. The laser procedure can bring advantages whenever holes in small, ridge-like bone structures have to be prepared. Our main goal, however, is to reduce the effort for patient fixation. Laser "drilling" is a contactless process; therefore, in comparison with conventional drilling, the jaw is not subjected to mechanical forces. As a consequence, the online tracking and re-registration of the patient has to cope with small residual movements only, even in the case of non-invasive patient fixation. A basis for the development of a laser drilling procedure is provided by investigations on laser osteotomy with pulsed CO2 lasers performed in recent years (see [3-5] and Refs. therein). The goal of these works was a fast laser "processing" of hard tissue without undesirable thermal side effects. This can be achieved if the deposited laser energy is confined to a thin absorption layer and effectively used for a thermomechanical ablation of the hard tissue. The ablation process has to proceed at the lowest possible temperature and has to be faster than heat diffusion in the tissue.

4.1. Preliminary Results
Due to the very high peak intensity the ablation starts very quickly (time delay of ~10 ns) and the heat diffusion has no time to proceed (the temperature relaxation time constant is about 100 µs). On the other hand, the material removal rate with these short pulses is below 10 µm/pulse. Only part of the laser pulse energy is used for the ablation. The dense ablation debris absorbs the laser light almost immediately and shields the tissue from the beam. Our experiments show that this system can be used to produce deep (up to 10 mm) and narrow (200 µm) holes or incisions, as well as pits of different shapes in the bone (fig. 4). However, the drilling of relatively big holes for tooth implants will be rather time consuming. To remove 1 cm³ of compact tissue, one needs more than 10 kJ of laser energy, while the average laser power is limited to 3 W.
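These two figures directly bound the drilling time; a quick back-of-the-envelope check:

```python
# Rough time estimate from the figures quoted in the text:
# removing 1 cm^3 of compact bone needs more than 10 kJ,
# while the average laser power is limited to 3 W.
energy_needed_J = 10e3
avg_power_W = 3.0
drill_time_min = energy_needed_J / avg_power_W / 60.0
# More than about 55 minutes per cm^3, hence "rather time consuming".
```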
Figure 4. 3-D removal of compact bone tissue (bull femur) with the mini TEA CO2 laser (left). Examples of deep holes drilled in compact bone with the slab CO2 laser under application of a special beam scanning procedure. Irradiation parameters: λ = 10.6 µm, τ_p = 80 µs, E = 80-90 mJ, f = 100-200 Hz, duration of the drilling up to 5 min, depth up to 16 mm (right).
Much faster processing of bone tissue is possible with a powerful "slab" CO2 laser. It provides energies of up to 90 mJ in pulses of 80 µs duration (FWHM). The repetition rate goes up to 10000 Hz. For laser osteotomy, repetition rates ranging from 50 to 1000 Hz are used. The beam can be focused down to a spot size of 120 µm without optical breakdown. The corresponding energy density amounts to up to 600 J/cm² per pulse at a peak intensity of 8 MW/cm². The system is coupled through an articulated mirror arm and matching optics with a fast galvanic X-Y beam deflector, which is equipped with a plane-field focusing lens. The laser and the deflector are controlled by a PC. With this system we reach cutting rates for compact bone similar to those of mechanical tools. For example, at an average power of 70 W a cut rate of 70 mm/min at a cut depth of 6 mm has been demonstrated [4]. The laser leaves no indication of carbonization in the tissue. Histological evidence of thermal effects can be found only in a 50-µm-broad zone adjacent to the cut border [5]. Other key issues of the project are the coupling of the laser system with a robot arm and a navigation system. In initial trials we plan to transmit the laser beam to the robot with an articulated mirror arm or a flexible light guide. As a final technical solution, an integration of the laser into the robot is envisaged. The execution of the very fast and fine beam motion necessary to drill the implant cavities is not an optimal task for a robust robot arm. That is why we plan to develop a compact end-applicator with an integrated beam deflector, which takes control over the laser beam once the robot is exactly positioned relative to the jaw bone.

4.2. Control of surgical laser drilling
A drawback of contactless surgical laser drilling is the lack of tactile feedback. Thus, no direct information about the drilling depth is available. This is a critical issue, because vital neighbouring structures such as the canalis mandibulae with the alveolar nerve must not be damaged during the intervention. Therefore, a new method of precise depth control must be developed. We assume that information about the drilling depth and the type of tissue can be extracted from the acoustic signal accompanying the drilling. The source of the signal is the laser ablation process at the bottom of the bore. The laser induces acoustic pulses with very short rise times. With corresponding techniques one can measure the change in the runtime of the acoustic pulses and thereby determine the position of the bore bottom. At present this technique is under development at the center of advanced European studies and research. Laboratory depth measurements on tissue models have demonstrated a precision of ±100 µm. Further, we intend to prove that the sound reflection at the border of the canalis mandibulae can be reliably detected. In this way, information about the distance between the bore bottom and the nerve channel can be directly assessed.
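The runtime idea can be quantified with a simple model; the sound speed below is an assumed, illustrative value for cortical bone, not a figure from the project:

```python
# Illustrative sketch of the runtime idea: the ablation at the bore
# bottom emits an acoustic pulse, and as the bore deepens the pulse
# arrives later at a fixed sensor, so the extra path equals the
# extra depth. The sound speed is an assumption for illustration.
C_BONE = 3000.0  # m/s, assumed sound speed in cortical bone

def depth_increment(runtime_change_s, c=C_BONE):
    """Depth change (m) of the bore bottom for a measured change in
    the acoustic pulse runtime (s)."""
    return c * runtime_change_s

# A depth precision of 100 um then corresponds to resolving runtime
# changes of roughly 33 ns:
dt = 100e-6 / C_BONE
```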
Acknowledgments

The authors thank the Ministry for Economy, Transport, Agriculture and Viniculture of Rhineland-Palatinate, Germany, for granting CeMicro.
References
1. G. Gubaidulin, A. Zimmermann and R. Evbatyrov, Kompensation von Patientenbewegungen bei navigierten und roboterisierten kieferchirurgischen Eingriffen. In: Physikalische Methoden der Laser- und Medizintechnik, ed. by Th. M. Buzug et al., Fortschritt-Berichte VDI 17 (231), 103 (2003).
2. Th. M. Buzug, Einführung in die Computertomographie, Springer (2004).
3. M. M. Ivanenko, P. Hering, Wet bone ablation with mechanically Q-switched high-repetition-rate CO2 laser, Appl. Phys. B 67, 395 (1998).
4. S. Afilal, M. Ivanenko, M. Werner, P. Hering, Osteotomie mit 80 µs CO2-Laserpulsen. In: Physikalische Methoden der Laser- und Medizintechnik, ed. by Th. M. Buzug et al., Fortschritt-Berichte VDI 17 (231), 164 (2003).
5. M. Frentzen, W. Götz, M. Ivanenko, S. Afilal, M. Werner, P. Hering, Osteotomy with 80-µs CO2 laser pulses - histological results, Lasers in Medical Science 18, 119 (2003).
6. J. Bongartz, Hochauflösende dreidimensionale Gesichtsprofilvermessung mit kurzgepulster Holographie, PhD thesis, Mathematisch-Naturwissenschaftliche Fakultät der Heinrich-Heine-Universität Düsseldorf, 2002, http://deposit.ddb.de/cgi-bin/dokserv?idn=964966670.
SURGICOBOT: SURGICAL GESTURE ASSISTANCE COBOT FOR MAXILLO-FACIAL INTERVENTIONS

ERIC BONNEAU
Service Robotique et Systèmes Interactifs, CEA-List, Centre de Fontenay-aux-Roses, BP 6, 92265 Fontenay-aux-Roses Cedex, France

FARID TAHA
Service de Chirurgie Maxillo-Faciale, CHU Amiens, CHU Nord, 80054 Amiens, France

PHILIPPE GRAVEZ, SYLVIE LAMY
Service Robotique et Systèmes Interactifs, CEA-List, Centre de Fontenay-aux-Roses, BP 6, 92265 Fontenay-aux-Roses Cedex, France
Our paper describes the development of a surgical COBOT (COllaborative roBOT) demonstrator and its validation on a maxillo-facial surgery application. The basic purpose of a surgical COBOT is to help surgeons avoid harming sensitive anatomical structures that are only visible on pre-operative imagery. The kernel of the system is a haptic robotic arm featuring both excellent transparency and a capability to generate forces in the hand of its operator. A standard surgical instrument, a maxillo-facial drill in this case, is fixed at the extremity of the haptic arm but is held by the surgeon in the same manner as in a manual surgery configuration. The other essential components of the system are an advanced haptic controller that implements the desired behaviours of the haptic arm and a virtual reality engine able to compute force constraints based on a simulation of the interactions between the surgical instrument and the modelled patient anatomy. Obviously, the system must also embody a calibration function to match pre-operative models with the per-operative real anatomical features. Using the COBOT, the surgeon can freely handle his surgical instrument except in the volumes defined beforehand as protected anatomical zones. The COBOT then restrains the surgeon from penetrating these zones and thus actually assists the safe performance of delicate surgical operations by actively protecting anatomical structures like the spinal cord or a nerve in the mandible. This concept clearly combines several technologies, mainly force-feedback robotics, virtual reality and the exploitation of 3D models constructed from medical imagery of the patient. It possesses a huge development potential, since by merging repulsive and attractive guidance forces and by trimming their values, a COBOT may become a multifunctional tool able to assist surgeons in a wide range of applications.
1. Introduction

The developments and experiments described in this paper aim at assessing the capabilities of surgical COBOTs (COllaborative roBOTs). The main benefit of such devices is to combine image-guided surgery and robotized gesture
assistance for protecting hidden anatomical structures without impairing the ability of the surgeon to intervene and make full use of his expertise during the operation. Our system, called SURGICOBOT, is based on a Virtuose haptic interface that features excellent transparency, suitable force capability and reasonable stiffness. A standard surgical instrument is fixed at the extremity of the haptic arm but is held by the surgeon in the same manner as in a manual surgery configuration. The other key components of the system are an advanced haptic controller that implements the desired behaviours of the haptic arm and a virtual reality engine computing force constraints based on a simulation of the interactions between the surgical instrument and the modelled patient anatomy. SURGICOBOT was tested on the osteotomy of the lower mandible, a maxillo-facial procedure that is particularly suited to the capabilities of COBOTs. The preliminary step of the demonstration consisted in processing the pre-operative imaging data in order to optimise the efficiency (especially from the real-time point of view) of the haptic simulation. The system was then tested in the frame of a virtual reality application interacting with a virtual model of the mandible. The final experiment was an "Augmented Reality" demonstration where SURGICOBOT interacted concurrently (i) with physical models capturing either the geometry or the consistency of a real jaw, and (ii) with the virtual models of these objects obtained through CT scans and calibrated to match the position and orientation of their physical counterparts. The virtual simulation was used there to guide the surgeon's gestures in the real world.

2. Surgical needs and conceptual approach
Often, surgeons have to perform delicate interventions close to sensitive anatomical features that cannot be directly seen. Examples of such situations are found in maxillo-facial and orthopaedic surgery, which deal with relatively rigid bone structures. In these two disciplines, image-guided surgery systems were a significant innovation, since they allow the surgeon to visualize the position and orientation of his instruments with respect to a reference generally constituted by a pre-operative medical imaging data set. However, the instruments are still manually controlled by the surgeon, and even with enhanced visual feedback some technical gestures remain difficult to perform in 3D. The next step was thus the introduction of robots that hold the surgical instruments and accurately reproduce movements that are planned pre-operatively [3]. But these manipulators lack flexibility, and a strong need is now expressed for systems that combine the adaptability of the surgeon and the confidence of the robotized gesture.

A promising answer is provided by the COBOT concept. A COBOT is a system that operates in parallel with the operator, i.e. the surgical instrument is held both by the surgeon and by the COBOT arm. Most of the time, the surgeon can freely move the instrument, forgetting the attached robotic device, but in some predefined space areas the COBOT generates force constraints repulsing the instrument away from a sensitive anatomical feature, or attracting it towards a targeted position or trajectory. As the generated forces are generally modest, a COBOT acts as a guide (the surgeon may always apply higher forces) rather than a substitute. COBOTs are thus well suited to situations where the surgeon has to perform demanding gestures that imply both image guidance and peroperative surgical manual skill. While the COBOT concept is very attractive, it also raises difficult issues, some of which are still unsettled:
- Processing of the imagery data so as to generate force constraints in a very strict real-time context.
- An efficient trade-off between the transparency required for the surgeon to freely move the instrument fastened at the end of the COBOT and the stiffness necessary to secure the guiding accuracy.
- Specification of the COBOT workspace and force capacity taking into account the task constraints.
- Design of the human-machine interface, correctly assessing the respective parts played by force, visual and audio feedbacks.
To investigate these issues, SURGICOBOT was developed and tested on the osteotomy of the lower mandible. The key point of this intervention is to avoid damaging the nerve that follows a narrow channel inside the mandible. The classical procedure consists in drilling a number of holes along the desired fracture line in order to weaken the jaw before breaking it with a small hammer.
On the one hand, the mandible must be sufficiently weakened so as to break along the desired fracture line; on the other hand, the drill must not touch the nerve located inside. It is important to mention here that a living mandible is made both of resistant cortical bone and of spongy bone (where the nerve channel lies) that can be easily penetrated. The surgical COBOT is not a new concept, and two such systems are well known in the medical robotics community. The PADyC (TIMC) [1] has been developed for pericardial punctures and possesses no actuators but brakes, and is thus intrinsically safe. The second system is Acrobot (Imperial College) [4], which assists surgeons performing total knee replacement (TKR) with active constraints. As specified, these two dedicated systems feature rigid mechanical structures and therefore achieve a very different transparency/stiffness trade-off than the supposedly more flexible SURGICOBOT.
3. The Virtuose technology
SURGICOBOT largely relies on the Virtuose technology developed by CEA-List for haptic Virtual Reality (VR) applications and transferred to Haption, a CEA spin-off. Two different haptic devices were used for the SURGICOBOT experiments (table 1 gives the specifications of these systems). The first one is the Virtuose 6D35-40, which supports force feedback on translations and rotations. It is a commercially available system for industrial applications, especially targeting the design-office market [5]. The second system, the Virtuose 3D, is actually the prototype of the Virtuose series. This lightweight and cable-driven arm moves along translations and rotations, but only provides translation force feedback. It was selected for the "Augmented Reality" experiments because it was easier to modify for holding the surgical tool and led to better human-factor features.

Table 1. Specification of the SURGICOBOT Virtuose interfaces.

                                     Virtuose 3D                 Virtuose 6D35-40
Workspace                            420 x 490 x 920 mm          400 x 400 x 400 mm, 270 x 120 x 160°
Measured motions                     x, y, z, pitch, yaw, roll   x, y, z, pitch, yaw, roll
Force feedback                       x, y, z                     x, y, z, pitch, yaw, roll
Continuous force/torque capability   10 N                        10 N / 1 Nm
Maximum force/torque capability      34 N                        35 N / 3 Nm
Stiffness                            2300 N/m                    3000 N/m, 40 Nm/rad
These haptic devices are linked to a VR application within the global architecture [2] described in figure 1:
Figure 1. Architecture of Virtuose VR applications: the operator's position inputs drive the haptic interface (Virtuose device), whose joint positions are passed to the haptic controller, which in turn returns actuator commands; the dynamic simulation engine exchanges data with the haptic controller and the graphical simulator, while force and visual feedback are returned to the operator.
The haptic interface (Virtuose arm) captures the movements made by the operator and generates forces in his hand. The haptic controller manages in real time (1 kHz frequency) a bilateral coupling between the haptic interface and a virtual object (typically a tool). A Linux-RTAI PC/104 computer supports the haptic controller. The dynamic simulation engine manages in real time the interactions between the virtual object coupled to the haptic interface and its simulated environment. It makes use of standard physical simulation software that runs on another PC. The graphical simulator provides the visual (and audio) feedback to the operator. It is built using the Virtools VR development software and requires a dedicated PC that communicates with the dynamic simulation engine via a VRPN (Virtual Reality Peripheral Network) interface. In this context, the main functionality of SURGICOBOT may be achieved by creating a virtual protecting envelope around the mandible nerve and by simulating the collisions that occur between this envelope and the virtual replica of the surgeon's drill. The size of the protecting envelope depends upon the COBOT stiffness and the force exerted by the surgeon on the drill. This force is estimated at 5 N when drilling into the cortical bone and around 1 N in the spongy bone. The development therefore consists of the following tasks:
1. Fix a standard surgical drill at the end of the Virtuose arm.
2. Create the numerical models of the relevant objects: surgical drill, mandible, nerve channel and its protecting envelope.
3. Complete and optimize these models with respect to the needs and constraints of the dynamic engine, and link it to the haptic controller.
4. Design the graphical simulation.
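As an illustration of the envelope mechanism (not the actual Virtuose controller code), the nerve channel can be modelled as a line segment and the guidance as a virtual spring that pushes the drill tip out of a cylindrical protecting envelope; the stiffness and radius below are assumptions chosen only to match the orders of magnitude quoted in the text:

```python
import numpy as np

# Assumed parameters for illustration only.
K = 3000.0          # N/m, order of the arm stiffness
R_ENVELOPE = 0.004  # m, assumed envelope radius around the nerve

def repulsive_force(tip, seg_a, seg_b, k=K, r=R_ENVELOPE):
    """Force (3-vector, N) on the drill tip; zero outside the
    cylindrical envelope around segment seg_a -> seg_b."""
    a, b, p = (np.asarray(v, float) for v in (seg_a, seg_b, tip))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    d = p - (a + t * ab)            # vector from channel to tip
    dist = np.linalg.norm(d)
    if dist >= r or dist < 1e-12:   # outside, or exactly on the axis
        return np.zeros(3)
    return k * (r - dist) * d / dist  # outward spring force

# Tip 3 mm off the channel axis, i.e. 1 mm inside the 4 mm
# envelope: |f| = 3000 * 0.001 = 3 N, directed outward.
f = repulsive_force([0.003, 0.0, 0.0], [0.0, 0.0, -0.01], [0.0, 0.0, 0.01])
```

The spring picture also conveys the stiffness issue: against an arm stiffness of 3000 N/m, the 5 N force typical of cortical-bone drilling already corresponds to a penetration of about 1.7 mm.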
4. The SURGICOBOT experiments

The experiments were performed in two steps. The first one (using a Virtuose 6D35-40) was made in virtual reality conditions without any interaction between the surgeon's drill and a physical object. Its purpose was to validate the processing of the imaging data with respect to the haptic constraints. The final experiments, on the other hand, involved the Virtuose 3D arm, a real surgical instrument and several physical mandible models.

4.1. Virtual reality conditions
The virtual anatomic models required for these tests were provided by CHU Amiens from a set of CT-scan slices. The protecting envelope of the nerve channel was then added and the drill model created with a standard VR modeler. At this stage, it was necessary to rework the skull and mandible anatomic models in order to comply with the main constraints of current dynamic simulation engines: reduce the number of triangles of the model polyhedra, and eliminate or modify model objects that are not closed and convex. This is currently a major problem for all VR applications, but progress is being made to reduce its impact. The haptic function also requires specifying the attributes of the virtual objects, mainly mass, centre of gravity, friction and inertia, as well as the collisions that are to be simulated: instrument with protecting envelope and instrument with nerve channel. Finally, the gains of the haptic feedback are determined and the graphical simulation developed, defining how collisions are displayed to the surgeon through color changes and sounds. Using the Virtuose 6D35-40, these tests demonstrated the very good quality of the force feedback provided to the surgeon when his instrument approaches the nerve channel (figures 2 and 3).
Figures 2 and 3. The mandible model showing the nerve channel and its protecting envelope and the SURGICOBOT virtual reality demonstration in the CEA-List VR room.
4.2. Augmented reality conditions

For the second and final experiments, several physical mockups (and the corresponding virtual models) were used:
- A wooden cube to calibrate the system and evaluate its accuracy.
- A resin skull generated from the same set of CT-scan data as in the first experiments. Its purpose was mainly to validate the COBOT-assisted osteotomy from the kinematic point of view.
- A composite model made from a layer of polystyrene framed by two PVC sheets. This mockup was intended to reproduce the consistency of a real mandible (cortical and spongy bones) and was utilized to investigate the dynamic aspects of surgical haptic guidance. Its virtual counterpart therefore included a protected volume located in the polystyrene thickness.
Thanks to the Virtuose interface, a straightforward calibration procedure was carried out, simply requiring the surgeon to designate three points with the drill on the surface of the virtual model (virtual contacts are felt through haptic feedback) and then the three matching points on the corresponding physical model. From a general point of view, the transparency of the system and the efficiency of haptic guiding when drilling inside soft material were greatly praised by the surgeon. The SURGICOBOT static accuracy (when no force is applied on the drill) is close to 1 mm. On the other hand, the surgeon experienced problems feeling the repulsive forces when drilling into hard material, due to the insufficient stiffness of the arm. In these conditions, the value of audio feedback must be stressed.
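A three-point calibration of this kind can be sketched as follows (a hypothetical helper, not the CEA-List implementation): build an orthonormal frame from each point triple and chain the two frames into the virtual-to-physical transform:

```python
import numpy as np

def frame_from_points(p1, p2, p3):
    """Orthonormal frame (R, origin) from three non-collinear points:
    x along p1->p2, z normal to the point plane, y = z x x."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    x = p2 - p1
    x /= np.linalg.norm(x)
    z = np.cross(x, p3 - p1)
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    return np.column_stack([x, y, z]), p1

def three_point_calibration(virtual_pts, physical_pts):
    """Rigid transform (A, b) mapping virtual-model coordinates onto
    the physical mockup, from the three designated point pairs."""
    Rv, ov = frame_from_points(*virtual_pts)
    Rp, op = frame_from_points(*physical_pts)
    A = Rp @ Rv.T
    b = op - A @ ov
    return A, b

# Synthetic check: physical points are the virtual ones rotated by
# 90 degrees about z and shifted by (1, 2, 3).
A_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
b_true = np.array([1.0, 2.0, 3.0])
virtual = [np.zeros(3), np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
physical = [A_true @ v + b_true for v in virtual]
A, b = three_point_calibration(virtual, physical)
```

Unlike a least-squares fit over many markers, three exact correspondences determine the transform uniquely, which is why the reported ~1 mm static accuracy depends directly on how precisely the three points are designated.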
Figures 4 and 5. The surgical drill operating on the resin skull and the experiments made with the composite model.
4.3. Results

The results of these experiments are summed up below:
- The general feeling of the efficiency of SURGICOBOT is very good, with a clearly perceived, good-quality haptic guidance.
- The drill attachment on the COBOT and the arm's terminal part must be designed with care to avoid the system fouling the surgeon's hand and to allow the instrument to penetrate inside the patient's mouth.
- 6 DOF mobility is absolutely necessary for the surgeon to feel at ease, even if the drill rotation around its axis is kinematically redundant. Translation (3 DOF) force feedback is sufficient for this procedure.
- When working with the drill, the surgeon does not use the visual display. However, audio feedback is very helpful. After some testing, the retained solution was to generate a moderate sound when the haptic guidance is active and a louder one when the surgeon applies a force that allows the instrument to penetrate inside the protecting envelope. In this manner, the surgeon was able to work quickly when away from the mandible nerve, and the first sound warned him that he had to manipulate the drill more delicately.
- The system lacks a function for tracking the mandible movements during the operation, as we cannot expect the jaw to be completely immobilized after the initial calibration. This capability may be provided by a standard surgical navigation system.
- A more rigid arm would be preferable to the current designs; one way to achieve this is to reduce the COBOT workspace, as the performed motions are measured in cm rather than in tens of cm.

5. Conclusion
These experiments have shown the COBOT concept to be an efficient way to assist surgeons in mandibular osteotomy. This validation now has to be completed by an in-depth study aiming at characterizing the performance of the system (especially its dynamic accuracy) through a more realistic simulation of the surgical procedure and with several surgeons. Another important point is to extend the application field of COBOTs to other operations, for example in rachis surgery. This obviously implies new validating experiments, since 6 DOF force feedback seems mandatory for rachis applications and the involved haptic guidance is more constrained than in the case studied here. However, we think that COBOTs may in the future become general-purpose surgical tools able to assist surgeons in a large set of interventions. From this point of view, the optimal transparency/stiffness trade-off is still to be determined.

References
1. O. Chavanon, L. Carrat, C. Pasqualini, E. Dubois, D. Blin and J. Troccaz, Computer-guided pericardiocentesis: experimental results and clinical perspectives, Herz 25-8, p. 761-8 (2000).
2. R. Gelin, C. Andriot, C. Montandon and L. Joussemet, Multi-modal immersion within the PHARE platform, IARP Joint Coordination Forum, Frascati, 2002.
3. C. Grueneis, R. Richter and F. Hennig, Clinical introduction of the CASPAR system, 4th International Symposium on Computer Assisted Orthopaedic Surgery, 1999.
4. M. Jakopec, S. Harris, F. Rodriguez y Baena, P. Gomes, J. Cobb and B. Davies, Preliminary results of an early clinical experience with the Acrobot system for total knee replacement surgery, MICCAI 2002, p. 256-63.
5. L. Joussemet, C. Andriot and J. Perret, 6 DOF haptic assembly by screw approach, IEEE Virtual Reality 2003, Los Angeles.
SENSOR-BASED INTRA-OPERATIVE NAVIGATION FOR ROBOT-ASSISTED KEYHOLE SURGERY

JAN STALLKAMP
Fraunhofer Institute for Manufacturing Engineering and Automation IPA, Nobelstraße 12, 70569 Stuttgart, Germany
This paper is based on the theory that manipulators represent an intermediate step in the development of robot-assisted keyhole surgery. The main reason why only manipulators, rather than robots, are used on elastic tissue is the lack of suitable sensor data for programming robots. This paper presents a measurement device for the localized acquisition of data which can be used to realize intra-operative navigation.
1. Introduction

1.1. Robot vs. manipulator in keyhole surgery

From the surgeon's present point of view, a robot or manipulator is primarily a supportive aid when performing keyhole surgery. With the da Vinci (Intuitive Surgical) or ZEUS (Computer Motion) manipulator systems, these expectations are fulfilled by a master-slave system which scales the input movements and possesses a filter for reducing natural hand tremor. However, these movements are mainly subject to the limitations of manual instrument guidance. The improved accuracy of this movement mechanism, as opposed to that of manual execution, can only be completely exploited through the use of automated instrument guidance. This is the basis of the theory that manipulators are an intermediate step towards using robots in keyhole surgery. The advantages of utilizing robots for future day-to-day use in hospitals justify the technical expenditure for such systems. Another point in favor of this theory is that the first systems for keyhole surgery were designed as robots before manipulators were ever conceived, e.g. [1][2]. However, for technical reasons, these were not extensively used in hospitals. The reason why robots are not utilized for keyhole surgery has nothing to do with the availability of movement mechanisms and controls: over six authorized solutions with various features have been developed. Also, path programming and calculation have been widely implemented not only in industry but also in orthopedic applications. The cause of this is that for these applications the data are available for planning path curves. Measurement equipment in use, such as computed tomography
(CT), magnetic resonance tomography (MRT) and surgical C-arms does not give up-to-date information about the position of important anatomical structures during surgery and is either too inaccurate or permits only a limited scope of spatial reconstruction. Especially in the case of keyhole surgery, the characteristics of elastic or sensitive tissue with complex geometric structures make these systems unsuitable for acquiring data to plan robot paths. This paper is concerned with the development of an optical device which will enable the planned intra-operative use of a robot for keyhole surgery in the future.

1.2. Intra-operative measurements during keyhole surgery
The patient's own natural (pulse/breathing) movements, repositioning of the patient or the surgical intervention itself lead to a geometrical dislocation of tissue structures which cannot be predicted. With endoscopy, the tip of the instrument is in a gas-filled cavity and visible alterations are due to dislocations in surface geometry. In conventional keyhole surgery, the endoscopic image of these surfaces is adequate to enable the surgeon to carry out the operation manually. The programming of a robot in such a situation requires at least the additional measuring and imaging of visible surfaces relative to the tip of the instrument. In order to be able to compensate for movements of the surfaces, relevant surface segments also have to be recorded dynamically. Such dynamics can only be achieved through the use of ultrasound systems or localized measurement devices situated directly at the tip of the instrument. An ultrasound process appears to be only partially suitable, due to the fact that the sound source or transducer requires contact with the tissue and close-range measurements are only possible up to a certain extent. For use at the tip of the instrument, optical devices, especially those based on triangulation, appear to be the most suitable, as they are capable of a high degree of accuracy and resolution and their technology is easily available. A similar approach can be found in [3]. However, the functioning of optical systems is prone to errors because of highly varying surface conditions. Especially when measuring in organic environments, undesired effects may take place which make a measurement impossible or cause measurement errors:
- Elimination of or reduction in relevant diffuse reflection fractions due to total reflection, absorption or transluminescence.
- Disturbance of the measurement due to complex surface geometries, e.g. irregularities in undercuts or multiple projections of the measurement spot.
- Measurement errors caused by the effects of refraction in the case of surfaces flooded, for example, with rinsing solutions or body fluids.
In addition to realizing an accurate measurement under these conditions, there is also a demand to miniaturize the instrument to diameters below 7 mm. This aspect has
also essentially prevented the implementation of optical systems until now. The objective was therefore to develop a suitable measurement technique capable of supplying up-to-date, accurate results under keyhole surgery conditions.
2. Approach

2.1. Set-up of the measurement device
Typical triangulation sensors are based on measuring the displacement of a point on a CCD chip or a position-sensitive detector (PSD). Difficulties arise when such a device is miniaturized: shortening the baseline reduces the measurement area, or accuracy suffers from the occurrence of obtuse angles. Under these general conditions, the fundamental concept shown in Figure 1 was used for the sensor.

Figure 1: Representation of the measuring technique (rotation axis; measurement spot on the tissue surface)
The sensor is made up of two movable mirrors which are illuminated by a laser in such a way that the laser beams can be tilted in one plane in front of the instrument. During the measurement procedure, one laser beam marks the measuring point and the second beam is adjusted until it overlaps the first. The distance between the reflection points is known from the construction, and the actual mirror angle setting is known once a special calibration process has been carried out. In this way, the position of the measurement spot relative to the reference coordinate system of the scanner can be calculated from the base side of a triangle and its two adjacent angles. With this set-up, only one line can be measured initially; however, by rotating the scanner, the complete volume in front of the instrument can be measured.
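The angle-side-angle computation described above can be sketched as follows. Given the known baseline between the two mirror reflection points and the two calibrated beam angles, the spot position follows from the law of sines. This is our own illustration; the function name, units and coordinate conventions are assumptions, not taken from the paper.

```python
import math

def triangulate_spot(baseline_mm, alpha, beta):
    """Locate the measurement spot from the known baseline between the
    two mirror reflection points and the two beam angles alpha, beta
    (radians, measured from the baseline) -- the angle-side-angle
    construction used by the scanner."""
    gamma = math.pi - alpha - beta  # angle at the measurement spot
    if gamma <= 0.0:
        raise ValueError("beams do not intersect in front of the scanner")
    # law of sines: distance from the first reflection point to the spot
    d = baseline_mm * math.sin(beta) / math.sin(gamma)
    # Cartesian coordinates, with the first reflection point at the
    # origin and the baseline along the x-axis
    return d * math.cos(alpha), d * math.sin(alpha)
```

With a 10 mm baseline and both beams at 45 degrees, for example, the spot lies 5 mm along and 5 mm in front of the baseline.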
A major advantage of this method is its independence from the imaging properties of the lens; the latter is not required to supply quantitative statements but only binary ones, i.e. either "overlaps" or "does not overlap". With a few limitations, the measurement area depends only on the geometry of the scanner, i.e. the direction of the illumination and the mirror. Accuracy is obtained by recording the reference points from the individual measuring spots and also depends on the positioning accuracy of the mirror drive.
2.2. Compensating for parasitic effects

In order to determine overlapping, the reference points of the two-dimensional measuring spots projected onto the detector need to be calculated. The ideal situation, a geometrically undistorted image of the measuring spot on the detector chip, rarely occurs. Especially with organic surface materials, the characteristics mentioned in Section 1.2 deform the measuring point compared with the original beam profile and also lead to poor contrast and a random distribution of intensity. These characteristics permit neither the exclusive use of contour- or surface-based methods nor the assumption of a (Gaussian) intensity distribution. Consequently, the central point is calculated using each of the two methods and the results are compared. Using a formula which takes the resolution of the detector and the spot expansion into consideration, a certainty factor is derived and compared with a limiting value. If the two results differ from one another, the result is assumed to be uncertain and the measuring point is either marked or rejected. This technique reflects the risk management of the system, which has been kept simple: in general, faulty results from surface measurements are made obvious to the surgeon and barred from use in the path-planning data.

With flooded, transparent surfaces, the laser beam is refracted before it reaches the surface. If the tilting plane of the laser beams is no longer perpendicular to the surface of the liquid, the measuring spot visible on the detector must lie beyond this plane. Overlapping is then no longer achieved by tilting the second search mirror but requires a defined rotation of the mirror around its central axis. The solution of the resulting system of equations requires the normal vector of the water surface at the point of penetration and a known refraction index.
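The refraction correction described above amounts to applying Snell's law in vector form at the liquid surface. The sketch below is our own illustration, not the authors' implementation; the variable names and the default air/water indices (with n of about 1.33 for water) are assumptions.

```python
import numpy as np

def refract(direction, normal, n1=1.0, n2=1.33):
    """Refract a beam at a liquid surface using the vector form of
    Snell's law. `direction` is the incoming beam, `normal` the surface
    normal pointing back toward the source; n1 -> n2 are the refractive
    indices (air into water by default). Returns the refracted unit
    direction, or None on total internal reflection."""
    d = np.asarray(direction, dtype=float)
    n = np.asarray(normal, dtype=float)
    d /= np.linalg.norm(d)
    n /= np.linalg.norm(n)
    r = n1 / n2
    cos_i = -np.dot(n, d)                      # cosine of incidence angle
    sin_t_sq = r * r * (1.0 - cos_i * cos_i)   # Snell: sin(t) = r * sin(i)
    if sin_t_sq > 1.0:
        return None                            # total internal reflection
    cos_t = np.sqrt(1.0 - sin_t_sq)
    return r * d + (r * cos_i - cos_t) * n     # refracted unit direction
```

At normal incidence the beam passes through unchanged; at oblique incidence it is bent toward the normal, which is exactly the displacement the overlap procedure has to compensate for.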
During an operation, the refractive index of transparent liquids can be approximated by that of water. The normal vector can only be approximated numerically, using a further measurement step and a series of additional assumptions, so that the system of equations can then be solved.

2.3. Functional model

The measurement method described above was implemented in a functional model with an overall shaft diameter of 20 mm (see Figure 2). The head of the scanner, with a diameter of 13.2 mm, is made up of two mirrors, each with a surface area of 2 x 2 mm², set into a rotatable cylinder. The laser of the functional model is fixed at the center of the cylinder and illuminates both mirrors laterally via a beam splitter.
Figure 2: Functional model of the instrument
The mirrors are driven by voice-coil actuators via pre-stressed wire pulls. If the wire pull is pre-stressed appropriately, the mirrors can be adjusted quickly and continuously. Together with the path-measuring sensors which form part of the closed control loop, the drives and cylinders are fixed to a rotatable frame run on bearings. This frame is rotated via a step-by-step motion linkage during the measurement procedure. As a result of the rotation, it is possible to measure the surface in front of the instrument spatially and to compensate for the refractive effect by overlapping the measuring points. From the analysis and from talks with surgeons, it became clear at the start of the development that it was necessary to integrate the video-endoscopic image into the overall concept, because no other measurement device to date can replace its resolution and color information. To improve the use of space, an endoscope was used both for conventional video-image representation for the surgeon and as a detector. To do this, for test purposes two endoscopes outside the rotatable frame were attached to the instrument and integrated into the shaft with the guide cylinder.
2.4. Sequences

The implementation of this measurement device necessarily leads to altered operating sequences when using a robot for keyhole surgery. The constantly updated data enables planning to become intraoperative rather than preoperative; in the process, dynamic alterations in the tissue structure can be taken into account. However, intraoperative planning demands new user interfaces and application flows in the sterile area. In this case a surgeon plans the next step of an intervention based on the actual situation at the operating site. The planning procedure is repeated after each step until the operation has been completed. Consequently, a surgeon must be able to carry out path programming easily and safely directly at the operating table. Suitable interaction systems for use in sterile areas were developed in an earlier project [4]. A new concept was developed for a graphical user interface (GUI) which guides the surgeon step by step, via a standard operating sequence, from an initial point through the required settings (see Figure 3).
Figure 3: Sequences of the planning process, with a billiard ball as a video test image (example)
The video endoscope image is a main source of information for this. As the endoscopic image is simultaneously used as a detector, measurement data can be directly assigned as spatial vectors to the pixels of the video frame. The video frame is thus presented to the surgeon with a spatial, three-dimensional texture into which he can enter planning data in the form of points, lines or planes.

3. Discussion
So far, the main emphasis has been on testing the measurement features of the triangulation sensor in a simulated operation scenario. The experiments showed that an accuracy of better than 0.1 mm could be achieved, even in environmental conditions subject to the above-mentioned disturbing influences (see Figure 4).
Figure 4: First reconstruction results with low horizontal resolution (2 mm)
However, with the frame grabber used so far, it has not yet been possible to attain the desired measuring speed of approx. 10 Hz. Besides continued miniaturization down to the required diameter, the next step is the technical realization of an instrument for use in real operating scenarios.

4. Conclusion
The use of the described measuring device in robot-assisted keyhole surgery opens up the path towards utilizing robots in operations on elastic tissues which alter constantly during an intervention. The next step is to develop a system for pre-clinical use in order to prove that the efficiency of an operation can be increased through an intraoperative navigation system. To do this, a combined procedure is conceivable in which the movement mechanism is used partially as a robot and partially, as with existing keyhole-surgery systems, as a manipulator.
References
1. S.J. Harris, Q. Mei, F. Arambula-Cosio, R.D. Hibberd, S. Nathan, J.E.A. Wickham, B.L. Davies. A Robotic Procedure for Transurethral Resection of the Prostate. Second Annual International Symposium on Medical Robotics and Computer Assisted Surgery (MRCAS '95), Baltimore, Maryland, 264-271, 1995.
2. D. Glauser, H. Fankhauser, M. Epitaux, J.-L. Hefti, A. Jaccottet. Neurosurgical Robot Minerva: First Results and Current Developments. Medical Robotics and Computer Assisted Surgery (MRCAS '95), 2nd Annual Symposium, Baltimore, 1995.
3. M. Hayashibe, Y. Nakamura. Laser-Pointing Endoscope System for Intra-Operative 3D Geometric Registration. Proceedings of the 2001 IEEE International Conference on Robotics and Automation, 1543-1548, 2001.
4. J. Stallkamp, R.D. Schraft, A. Hiller. Bedeutung von Peripheriesystemen für den medizinischen Robotereinsatz [On the importance of peripheral systems for medical robotics]. 2. Symposium Neue Technologien für die Medizin, Shaker Verlag, 2000.
ROBOTIZED DISTRACTION DEVICE FOR SOFT TISSUE MONITORING IN KNEE REPLACEMENT SURGERY

C. MARMIGNON, A. LEMNIEI, P. CINQUIN
TIMC Laboratory, GMCAO Team, 13s Grenoble, La Tronche 38707, France
Knee replacement surgeons cannot currently quantitatively predict the optimal tradeoff between alignment and soft tissue balance. The system developed at the TIMC laboratory helps the surgeon to address this issue by providing a variety of measurements. Our system consists of a distraction device to apply separating forces between the tibia and femur, coupled with a computer-assisted surgical navigation system. In contrast to existing devices, our device works with the patella in its normal anatomical position and is used prior to making the femoral bone cuts. We performed a pilot experiment on two cadavers; the surgeon was able to assess soft tissue balance but judged the device as not powerful enough. We have now designed a more powerful prototype based on hydraulic technology to apply larger loads.
1. Introduction

Since a prosthesis is not a knee, a surgeon must evaluate the patient's joint to choose an adequate prosthesis and to prepare the knee to receive it. The surgeon tries to match the prosthesis geometry to that of the patient, while balancing the forces on the prosthesis. The surgeon will therefore consider the behavior of the ligaments after the prosthesis replaces the bone surfaces and will create a prosthesis space within the ligament envelope. The prosthesis space is bordered by the bone cuts and defined by the articular spaces in both extension and flexion. The extension space is established through the alignment of the limb and must be identical to the flexion space.
Figure 1: The question of the ligament balance remains a key issue.
To achieve those spaces, the surgeon may have to rotate the femoral component or release (incise) the soft tissues because of deformations of the envelope. However, surgeons cannot currently quantitatively predict the optimal tradeoff between alignment and soft tissue balance. This inability is significant because revision procedures are often performed because of instability, which is primarily attributable to incorrect ligament balancing. Devices known as tensors are sometimes used to assess ligament balance; they are typically used after the tibial cut is made to distract the bones until the ligaments are tight in order to indicate the shape and size of the gaps (e.g. the Insall, Freeman and Cores systems). A major problem with these tensors is that they require eversion of the patella, which distorts the soft tissue envelope of the knee. To avoid this problem, Muratsu developed a tensor which leaves the patella in place, thus reproducing the normal condition of the knee and improving the measurements, especially in flexion. It is used once the femoral cuts are done; ligament balancing is performed by releasing the appropriate ligaments. Other systems (e.g. Balansys, Centerpulse) control the distraction force, thereby improving the consistency of the measurement. The Centerpulse tensor is used in parallel with a navigation system and the external rotation of the femoral component is monitored; these measurements are biased in flexion because they are made with the patella upturned. Other navigation devices (e.g. Surgetics) measure the laxity without a tensor, using quick manipulations of the limb after the tibial cut is made. We propose a system which improves on these others by (1) being used after making only the tibial cut, (2) leaving the patella in its normal position, (3) applying a consistent distraction force throughout the range of motion of the knee, and (4) tracking the distraction with a computer-assisted navigation system.
2. Material & Method
Once the surgeon has incised the knee, the patellar tendon is turned over. The tibial cut is then carried out, which frees a minimal space of at least 5 mm. The range of distraction height was estimated at 15 mm in order to cover every femoral shape and ligament length. The force was estimated at 100 N on each compartment. The system has a base plate carrying two independent superior trays placed under the two femoral condyles. Distraction is applied between the base and each superior plate through a jack. In place of a screw and crank, the force is applied through a cable which is attached to a PID-controlled rack-and-pinion actuator manufactured by Alpes-Instrument, France. An analog I/O card connected to the computer provides the position command input. A graphic interface was developed and integrated into the generic Surgetics orthopedic navigation system. The protocol starts by acquiring the joint centers, followed by bone-morphing, then the planning and navigation of the tibial cut, and stops at the ligament-balancing step once the ligament releasing is carried out and the femoral cuts are defined. On the distraction interface the gap, the articular space, the ligament lengths and the distraction forces are monitored.
Figure 2: The tension is applied in the frontal plane by applying a medial and a lateral force, both parallel to the mechanical axis of the tibia.
The distraction forces and the heights of the superior plates are measured separately. The force measurement is acquired via a low-cost force-sensing resistor (FSR174, Alcyon Electronique, France) requiring only a simple electronic interface. It is inserted between the two plates composing each superior plate and sealed with a silicone paste. The two plates are assumed to be absolutely rigid, so that the force is distributed uniformly over the sensor, which is connected through a resistor bridge to an analog acquisition card. The height of the superior plates is measured by two Hall-effect sensors (UGN35, Honeywell, US) mounted on each superior plate, via the magnetic field emitted by a rare-earth magnet (101MG7, Allegro, US) integrated in the base plate. The UGN sensors are directly connected to the analog acquisition card.

3. Results
The prototype has a height range from 6.1 to 21.1 mm and a parallelism error of less than 0.5 degrees. The relation between the force exerted by the cable and the resultant force on the knee follows the ratio tan(A), where A is the half-angle of the parallelogram. To avoid bending the axes, the cable force is limited (safety factor of 10 with respect to the yield point) and the maximal resultant force is 20 N in the low position. The control platform of the cable tension is not dedicated to our application; consequently, the control of the position, the stroke length and the range of the force are limited. By increasing the speed-reduction ratio the maximum tensile force is 60 N. Two stroke lengths were tested: a small stroke of 50 mm and a long one of 200 mm. The small stroke allows good precision but only suits small forces; this precision improves the control of the hysteresis effect of the sheath. The long stroke was chosen, despite its lower precision, in order to obtain a greater force range.

Before the calibration, the resistor bridge is dimensioned to give the greatest output-voltage dynamic over the range of force. An S-shaped force sensor is mounted on a base and the applied force is compared with the acquired FSR output voltage. Two types of measurement were tested, one static and the other dynamic. The inverse relation between output voltage and force was approximated by a 5th-order polynomial. The measuring accuracy was about 200 g around the point of contact when the applied force was higher than 1 kg. This inaccuracy is related to the elasticity of the silicone joint and the non-absolute rigidity of the plates. The play in the parallelism (±0.5°) due to the mechanical fit induced an error of about 2 mm on the height measurement.

We tested the distraction device on two embalmed cadavers. The cadaveric knees were nonpathological and in very good condition. The bodies weighed 40 kg and 90 kg.
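The voltage-to-force calibration described above, a 5th-order polynomial fitted against a reference load cell, can be sketched as follows. The calibration samples here are invented for illustration only; they are not the authors' measured data.

```python
import numpy as np

# Hypothetical calibration samples: FSR output voltage (V) versus the
# force (kg) read from the reference S-shaped load cell. Illustrative
# values only -- not the measurements from the paper.
voltage = np.array([0.4, 0.9, 1.5, 2.0, 2.6, 3.1, 3.5, 3.8, 4.1, 4.3, 4.5])
force_kg = np.array([0.5, 1.0, 2.0, 3.0, 4.5, 6.0, 7.5, 9.0, 10.5, 12.0, 13.5])

# least-squares 5th-order polynomial mapping output voltage back to force
coeffs = np.polyfit(voltage, force_kg, 5)

def fsr_force(v):
    """Estimate the applied force (kg) from the FSR output voltage (V)."""
    return np.polyval(coeffs, v)
```

In use, each raw FSR voltage sample is passed through `fsr_force` before being displayed on the distraction interface; the fit residuals give a quick sanity check on the calibration quality.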
The acquisition of the articular centers and the bone-morphing procedure were performed successfully. The tibial cuts were both 10 mm deep. On the first cadaver, the distraction was applied during manual manipulations of the leg and, unfortunately, the applied torsion tore the silicone joint; this problem was solved by inserting two free teeth between the superior plates. On the second cadaver, we found that the distraction was more difficult due to the heavy weight of the leg; the surgeon had to initiate and assist the distraction manually during the manipulations. We also found that manipulation of the foot produced variations in the measured forces, and it was difficult to control these forces manually. Despite the difficulties, several trials were carried out in which we tracked the articular space and the distraction forces; the surgeon felt that the information we gathered would be useful if it were easier to obtain. We therefore decided to develop a second, more powerful prototype.
Figure 3: The new prototype is based on hydraulic technology.
4. Discussion
Our initial cadaver trials were for the most part successful. The surgeon was able to assess soft tissue balance but judged the device not powerful enough. Since the first prototype was limited to 100 N, we designed the new, more powerful prototype to apply loads of 200 N. Distraction is applied between the base and each superior plate through two rubber bladders which can be inflated until they apply 100 N each. The back of the base plate is shaped to fit around the posterior cruciate ligament if it is present. The bladders are inflated with water via a screw mechanism (currently manually operated, but easily motorized). The height of the distractor can range from 6 to 16 mm, and additional blocks of 1, 3 and 5 mm can be added if necessary. The bladders are inflated and the resulting force is computed from pressure sensors. A computer-assisted navigation system is used to find the bone gap. In contrast to most existing soft tissue assessment devices, the measurements can be made after a preliminary tibial cut but prior to any femoral bone cuts, and with the patella in its normal anatomical position.
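Since the new prototype computes the distraction force from pressure sensors, the conversion is simply F = p * A over each bladder's effective contact area. A minimal sketch; the 2 bar and 500 mm² figures below are illustrative assumptions, not specifications from the paper:

```python
def bladder_force(pressure_pa: float, contact_area_mm2: float) -> float:
    """Force (N) exerted by an inflated bladder: F = p * A,
    with the contact area converted from mm^2 to m^2."""
    return pressure_pa * contact_area_mm2 * 1e-6

# e.g. roughly 2 bar (2e5 Pa) over an effective 500 mm^2 contact patch
# would deliver the 100 N target per compartment
```

The effective area of a soft bladder changes as it inflates, so in practice the area term itself would need calibrating against a reference load cell, much as the FSR was calibrated for the first prototype.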
References
1. "Computer assisted knee surgical total replacement", F. Leitner, F. Picard, R. Minfelde, H.-J. Schulz, P. Cinquin, D. Saragaglia, in Lecture Notes in Computer Science, 1997, vol. 1205, pp. 629-638, eds. Troccaz, Grimson, Mösges, Springer Verlag.
2. "Determining proper femoral component rotational alignment during TKA", C.W. Olcott and R.D. Scott, American Journal of Knee Surgery, 2000.
3. "Contribution of computer navigation in the evaluation of femoral rotation during total knee replacement", D. Saragaglia, C. Chaussard, P. Liss, H. Pichon, D. Berne, M. Chaker, Computer-Assisted Orthopedic Surgery, 2002.
4. "Perfect balance in total knee arthroplasty: the elusive compromise", M.J. Winemaker, Journal of Arthroplasty, January 2002.
5. "Evaluation of joint gap in TKA (a new tensor enabling measurement without patellar eversion)", H. Muratsu, N. Tsumura, M. Yamaguchi, K. Mizuno, R. Kuroda, T. Harada, S. Yoshiya, M. Kurosaka, American Academy of Orthopaedic Surgeons, February 2003.
6. Knee Distractor, C. Marmignon, P. Cinquin, S. Lavallée, French Patent No. 03 10241, 27/02/2003.
STATE OF THE ART OF SURGICAL ROBOTICS

PETER P. POTT 1), ANDREAS KÖPFLE 2), ACHIM WAGNER 3), ESSAMEDDIN BADREDDIN 3), REINHARD MÄNNER 2), PETER WEISER 4), HANNS-PETER SCHARF 1), MARKUS L. R. SCHWARZ 1)

1) Laboratory of Biomechanics and Experimental Orthopaedics, Department of Orthopaedic Surgery, University Clinic of Mannheim; Theodor-Kutzer-Ufer 1-3, 68167 Mannheim, Germany
2) Institute of Computer Science V, University of Mannheim; B6, 23-29, 68131 Mannheim, Germany
3) Automation Laboratory, University of Mannheim; B6, 23-29, 68131 Mannheim, Germany
4) Institute for CAE, University of Applied Sciences Mannheim; Windeckstraße 110, 68163 Mannheim, Germany

This work describes the worldwide state of the art in the development of robot-aided operative medicine. It is based on literature, proceedings and internet research. 84 robotic systems are presented briefly by describing applications, workgroups, project status and references. The examination of international research and development in the field of robotic applications in medicine shows a wide distribution across different areas of medicine. Only a few systems can be used clinically.
1. Introduction
Medical robotics is in a state of continuous development [1][2]. Research is carried out at university facilities as well as by commercial companies. Alongside the established applications in neurosurgery, orthopaedics and cardiac surgery, new developments provide individual procedures for trauma surgery and urology. Some devices fit into MRI scanners; others are small enough to be held in the user's hand. A number of systems are based on industrial robots; others provide specially developed kinematics for their application. The current state of the art in medical robotics, and in particular the development of the mechanical components, is summarised here. According to their application, the devices are assigned to the medical disciplines: (I) Surgery, (II) Orthopaedics, (III) Imaging, (IV) Neurosurgery, (V) ENT/OMS, (VI) Urology, (VII) Trauma surgery, (VIII) Radio surgery.
2. Material & Methods
An inquiry was carried out on the literature databases PubMed, IEEE Xplore and CiteSeer. Additionally, link lists of institutes working in the field of robot-aided medicine and the internet were searched. The proceedings of MICCAI 2002 and 2003, CARS 2003 and CURAC 2003 were reviewed. Above all, developments of the last two years were considered. The systems and projects are listed according to the disciplines mentioned above and, within these, arranged in alphabetical order.
3. Results
The results are summarized in Table 1.

Table 1: Robotic systems. Each entry lists task; project name; institute/company; country; status; reference.

I) Surgery
- Remote controlled surgery; ACTIVE TROKAR; Dept. of Mechano-Informatics, University of Tokyo; Japan; experimental set-up; www.ynl.t.u-tokyo.ac.jp
- Remote controlled surgery; ARTEMIS; Institut für angewandte Informatik, Forschungszentrum Karlsruhe; Germany; experimental set-up; www.iai.fzk.de/medtech/medrob/artemis/welcome.html
- Biopsy under image control; B-ROB1; Austrian Research Centers, Seibersdorf; Austria; experimental set-up; www.arcs.ac.at
- Remote controlled surgery; DAVINCI; Intuitive Surgical Ltd., Sunnyvale, CA; USA; commercially available; www.intuitivesurgical.com
- Remote controlled surgery; ENDOXIROB; LIRMM, CHU, LAAS, Sinters SA, Toulouse; France; experimental set-up; www.endoxirob.com
- Remote controlled surgery; HYPERFINGER; Department of Micro System Engineering, Nagoya University; Japan; experimental set-up; www.mech.nagoya-u.ac.jp
- Different surgical interventions; LARS; CISST, Johns Hopkins University; USA; experimental set-up; www.cisst.org
- Tremor compensation for microsurgery; MICRON; Robotics Institute, Carnegie Mellon, Pittsburgh, PA; USA; experimental set-up; www.ri.cmu.edu/projects/project_32.html
- Tremor compensation for microsurgery; MICROSURGICAL ASSISTANT; Computer Integrated Surgical Systems and Technology, Baltimore, MD; USA; experimental set-up; cs.jhu.edu
- Puncture under MRI control; MIRA; Institut für angewandte Informatik, Forschungszentrum Karlsruhe; Germany; experimental set-up; hbksun17.fzk.de
- PADYC; Laboratoire TIMC, Grenoble; France; experimental set-up; www-timc.imag.fr/Olivier.Schneider
- Remote controlled surgery; RAMS; Jet Propulsion Laboratory, Pasadena, CA; USA; experimental use; telerobotics.jpl.nasa.gov
- Biopsy under MRI control; ROBITOM; Institut für angewandte Informatik, Forschungszentrum Karlsruhe; Germany; experimental set-up; www.innomedic.de
- Remote controlled surgery; Telesurgical Workstation; Medical Robotics Group, Berkeley, CA; USA; experimental set-up; robotics.eecs.berkeley.edu/medical
- Remote controlled surgery; ZEUS; Intuitive Surgical Ltd., Sunnyvale, CA; USA; commercially available; www.computermotion.com
- Surgery in MRI; project name unknown; Surgical Assist Technology Group, Tsukuba; Japan; experimental set-up; unit.aist.go.jp/humanbiomed/surgical
- Tumour biopsy with ultrasonic control; project name unknown; ATRE-Lab, University of Tokyo; Japan; experimental set-up; www.atre.t.u-tokyo.ac.jp
- Liver biopsy under MRI control; project name unknown; ATRE-Lab, University of Tokyo; Japan; experimental set-up; www.i.u-tokyo.ac.jp
- Biopsy under CT control; project name unknown; Marconi Medical Systems, Highland Heights, OH; USA; experimental use; Yanof et al. [3]
- Remote controlled surgery; project name unknown; Deutsches Zentrum für Luft- und Raumfahrt, Wessling; Germany; experimental; www.robotic.dlr.de

II) Orthopaedics
- Knee replacement; ACROBOT; The Acrobot Company Ltd., London; UK; commercially available; www.acrobot.co.uk
- Percutaneous vertebroplasty; ACUBOT; Brady Urological Institute, Johns Hopkins University, Baltimore, MD; USA; experimental use; www.visualization.georgetown.edu
- Hip replacement; ARTHROBOT; Telerobotics and Control Laboratory, Seoul; Korea; experimental set-up; robot.kaist.ac.kr
- Hip and knee replacement; CASPAR; URS GmbH, Schwerin; Germany; no longer commercially available; www.medicalrobots.com
- Skull reconstruction; CRANIO; Lehrstuhl für Biomedizinische Technik, RWTH Aachen; Germany; experimental set-up; www.hia.rwth-aachen.de/research/cht/cranio.html
- Image-guided hip surgery; CRIGOS; Lehrstuhl für Biomedizinische Technik, RWTH Aachen; Germany; experimental set-up; www.hia.rwth-aachen.de
- Knee replacement; GALILEO NAV; Precision Implants AG, Aarau; Switzerland; commercially available; www.pisystems.ch
- Knee replacement; IMAGE REGISTRATION; Division BMGO, Leuven; Belgium; experimental set-up; www.mech.kuleuven.ac.be
- Pedicle screwing; ITD; Labor für Biomechanik und experimentelle Orthopädie, Mannheim; Germany; experimental set-up; www.intelligent-tool-drive.de
- Pedicle screwing; MARS; Technion, Israel Institute of Technology, Haifa; Israel; experimental use; meena.technion.ac.il
- Bone cement removal; MINARO; Lehrstuhl für Biomedizinische Technik, RWTH Aachen; Germany; experimental set-up; www.hia.rwth-aachen.de/research/cht/minaro.htm
- Hip replacement; MODICAS; Institut für Regelungs- und Steuerungstechnik, Zentrum für Sensorsysteme, Siegen; Germany; experimental use; www.modicas.de
- Implantation of screws; ORTHOSISTA; Armstrong Healthcare Ltd., High Wycombe; UK; experimental set-up; www.armstrong-healthcare.com
- Knee replacement; PFS; Center for Medical Robotics and Computer Assisted Surgery, Pittsburgh, PA; USA; experimental set-up; www.mrcas.ri.cmu.edu
- Hip replacement; ROBODOC; Integrated Surgical Systems; USA; commercially available; www.robodoc.com
- Hip replacement; ROBONAV; Berufsgenossenschaftliche Unfallklinik Frankfurt/M.; Germany; experimental use
- Pedicle screws; VISAROMED; Fraunhofer IPA, Stuttgart; Germany; experimental set-up; www.ipa.fhg.de/medizin
- Orthopaedic surgery under MRI control; project name unknown; Advanced Therapeutic Engineering Laboratory, Tokyo Denki University; Japan; experimental set-up; www.atl.b.dendai.ac.jp
- Spine surgery; project name unknown; Advanced Therapeutic Engineering Laboratory, Tokyo Denki University; Japan; experimental set-up; www.atl.b.dendai.ac.jp

III) Imaging
- Guidance of endoscopic camera; AESOP; Intuitive Surgical Ltd., Sunnyvale, CA; USA; commercially available; www.intuitivesurgical.com
- Guidance of endoscopic instruments; COPRIN; Institut national de recherche en informatique et en automatique (INRIA), Le Chesnay; France; experimental set-up; www-sop.inria.fr/coprin
- Intestinal endoscope; CROBOT / ENDOCRAWLER; Computer Integrated Medical Intervention Laboratory, Singapore; Singapore; experimental set-up; mrcas.mpe.ntu.edu.sg/research/crobot
- Telesonography; EDR; Department of Medical Informatics, Ehime; Japan; experimental use; www.medinfo.m.ehime-u.ac.jp
- Guidance of endoscopic camera; ENDOASSIST; Armstrong Healthcare Ltd., High Wycombe; UK; commercially available; www.armstrong-healthcare.com
- Guidance of endoscopic camera; ENDOSISTA; Armstrong Healthcare Ltd., High Wycombe; UK; experimental set-up; www.armstrong-healthcare.com
- Guidance of endoscopic camera; FIPS; Forschungszentrum Karlsruhe; Germany; experimental set-up; hbksun17.fzk.de
- Telesonography; HIPPOCRATE; Sinters SA, Toulouse; France; experimental set-up; www.lirmm.fr/~duchemin/Hippo.htm
- Intestinal endoscope; HYPER ENDOSCOPE; Biomedical Micromechanics Laboratory, Nagoya; Japan; experimental set-up; www.bmse.mech.nagoya-u.ac.jp/index-e.html
- Guidance of endoscopic camera; LAPARONAVIGATOR / NAVIOT; Biomedical Precision Engineering Laboratory, University of Tokyo; Japan; experimental set-up; lme.pe.u-tokyo.ac.jp/index-e.html
- Guidance of endoscopic camera; LER; Laboratoire TIMC, Grenoble; France; experimental use; www-timc.imag.fr
- Telesonography; OTELO; Sinters SA, Toulouse; France; experimental set-up; www.bourges.univ-orleans.fr/otelo/home.htm
- Telesonography; TER; Laboratoire TIMC, Grenoble; France; experimental set-up; www-timc.imag.fr/gmcao
- Intestinal endoscope; project name unknown; Dept. of Mechanical Engineering, Caltech, Pasadena, CA; USA; experimental set-up; robotics.caltech.edu/~jwb/medical.html
- Guidance of tools and endoscopes; EVOLUTION 1; URS GmbH, Schwerin; Germany; no longer commercially available; www.medicalrobots.com

IV) Neurosurgery
- Guidance of tools under CT control; MINERVA; Group for Surgical Robotics and Instrumentation; Switzerland; dmtwww.epfl.ch/imt/robchir/Minerva.html
- Guidance of tools under MRI control; NEUROARM; Department of Clinical Neurosciences, Calgary; Canada; www.mdrobotics.ca/neuroarm.htm
- Micro-forceps; NEUROBOT; ATRE-Lab, University of Tokyo; Japan; experimental use; www.atre.t.u-tokyo.ac.jp
- Milling of the lateral skull base; NEUROBOT; CIMIL-Lab, Singapore; Singapore; experimental set-up; mrcas.mpe.ntu.edu.sg/research/neurobot
- Stereotactic neurosurgery; NEUROMATE; Integrated Surgical Systems Ltd., Davis, CA; USA; commercially available; www.robodoc.com
- Stereotactic neurosurgery; NEUROSISTA; Armstrong Healthcare Ltd., High Wycombe; UK; experimental set-up; www.armstrong-healthcare.com
- Stereotactic neurosurgery; PATHFINDER; Armstrong Healthcare Ltd., High Wycombe; UK; commercially available; www.armstrong-healthcare.com
- Stereotactic neurosurgery in open MRI; project name unknown; Mechanical Engineering Research Laboratory, Hitachi; ATRE-Lab, Tokyo; Japan; experimental set-up; splweb.bwh.harvard.edu
- Remote controlled instrument guidance in open MRI; project name unknown; Mechanical Engineering Research Laboratory, Hitachi; Waseda University; Japan; experimental set-up; Tajima et al. [4]
- Brain retract manipulator for use in MRI scanner; project name unknown; Faculty of Advanced Techno-Surgery, Tokyo Women's Medical University; Japan
- Stereotactic neurosurgery in open MRI; project name unknown; Advanced Therapeutic Engineering Laboratory, Tokyo Denki University; Japan
- Transnasal neurosurgery in MRI; project name unknown; Surgical Assist Technology Group, Tsukuba, Ibaraki; Japan
- Microsurgery; project name unknown; Department of Micro System Engineering, Nagoya University
Japan
experimental http:Nwww.mech.nagoyause u.ac.jp1index-e.html
Gemany
experimental set-up experimental use expenmental set-up
,,.,
experimental www.humu.ac.jplABMESffATSli set-up ndex-e.html www.mech.waseda.ac.jp/ experimental Masamune et al. [ 5 ] set-up experimental http:llunit.aist.go.jpihumanbiomed use /surgical/
&hl
V)Oral- and mallofacial s um;ear-, nose-, throat surgery surgery of the sphenoid sinus Prosthesis placement Skull surgery
A13
HNO-Klinik, Erlangen
Otto von der Decke Surgical Robotics Lab, Berlin
RobacKA
Germany
Uni Karlsruhe, Uni Heidelberg, Germany DKFZ
www.hno.med.uni-erlangen.de www.medint.de wwwsrl-berlin.de http:llst%414.ira.uka.del
Mallofacial surgery ROBOPOlNT
Surgical Robotics Lab, Berlin
Germany
experimental wwwsrl-berlin.de use
Hearing aid implantation Hearing aid implantation
Lehrstuhl fir angewandte Informatik 111, Bayreuth Laboratorium fur Medizinrobotik, Sektion sensorische Biophysik. Tiibingen
Germany
experimental http:llai3.inf.uni-bayeuth.de set-up experimental http:llwww.medizin.uniset-up tuebingen.deihno1mednavrobotiWproiekt . . pmjckr him
RONAF project name unknown
Germany
VI) Urology Kidney biopsy
Resection of the prostate Resection of the pr0state Kidney biopsy
~-
PAKY (ACUBOT)
Dmdv Uroloaical Inttitule. Johns Hopkins University, Baltimore, MD
USA
PROBOT
Imperial College, London
UK
- ihu , edu ~uruhuuc\ ~ ~ exDcrimmlaI hun. iurolocv nr. ~~
use
Active Holder for MR-guided surgery Transrektal prostate project name hiop\y unknonn Vn)TrSUmS SUr@Iy Aulomrlic Jcrm.i- DL.K.MAKOH
ATRE-Lab, University of Tokyo A m - L a b , Tokyo 1; Johns Hopkin>. Ballmore. USA
experimental use Singapore experimental use Japan experimental set-up Japan I exuerimental USA use
I.IRMY UMR 55OoCNRS.
Frmcc
tom
Sinters SA, Toulouse
UROBOT
CIMIL-Lab, Singapore
~~~~~~~
ojectslrcd http:llrobotics.me,jhu.edui-llwlpa ky/paky.htm http:llwww.me.ic.ac.uWcase/mim/
pmjectslprobotl http:llmrcas.mpe.nru.edu.sgireseat chlurobothdex.htm www.atre.t.utokyo.ac.jp/index.html h~:I/ciss~eb.cs.ihu.edu/oeoole/e . . ., abor
expenmcniul http ,141 Ihl I 6 5 ISU Robuusp,
set-"
20Workshoplpresentatiod CARS250603-
E_Dombre_fileslframe.htm Bone reposition
REPROBO
Mechatonics Faculty, FH Regensburg
Germany
experimental bttp:lihomepages.fhset-up regens-
burg.del-mog39099lmk.orgImrdp mjekffrepmbo/reprobo.htm Bone reposition
Automatic dermatom
project name unknown SCALPP
Institute for robotics and
Germany
process control, Uni Braunschweig LlRMM UMR 5506 CNRS; Sinters SA, Toulouse
experimental http:/lwww.cs.tuset-uP
France
bs.delmb1welcome.htmI
experimental http:/l141.161.165.150iRobotics% set-up 20Workshoplpresentatiod CARS250603E Dombre files/frame.htm
380
~
Tumour ablation
CYBERKNIFE
Accuray Ltd. Sunnyvale, CA
Tumour ablation
project name unknown
Medical Applications Research Germany Group; Liibeck
USA
commercially www.accuray.com available experimental http:llwwwradig.lnfomatik.tuset-up muenchcn.delresearchED/index e.html
During this review 84 systems were identified: 20 developed for abdominal and thoracic surgery, 19 for orthopaedics, 14 for imaging, 14 for neurosurgery, 6 for oral- and maxillofacial surgery (OMS) and ear-, nose- and throat surgery (ENT) together, 5 for urological procedures, 4 for trauma surgery and 2 for radio surgery (see Fig. 1).
Fig. 1: Summary of the robot systems by discipline (pie chart: Orthopaedics 23%, Neurosurgery 17%, Imaging 17%, OMS/ENT 7%, Urology 6%, Radio surgery 2%; remainder abdominal/thoracic and trauma surgery).
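The discipline counts and the percentages in Fig. 1 can be cross-checked with a few lines (the counts are taken from the text above):

```python
# Cross-check of the survey counts (84 systems total) against the chart shares.
counts = {
    "abdominal and thoracic surgery": 20,
    "orthopaedics": 19,
    "imaging": 14,
    "neurosurgery": 14,
    "OMS and ENT": 6,
    "urology": 5,
    "trauma surgery": 4,
    "radio surgery": 2,
}
total = sum(counts.values())                               # 84
shares = {k: round(100 * v / total) for k, v in counts.items()}
# e.g. orthopaedics -> 23, neurosurgery -> 17, radio surgery -> 2
```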
51% of the presented systems come from Europe, 26% are being developed in Asia and 23% in North America. About two thirds of the examined projects are at an experimental stage and have not yet been tested on patients. 23% have been or are being tested on patients experimentally, and about 14% are commercially available in some countries.
4. Discussion
The aim of research and development in the field of medical robotics is to simplify surgical procedures and to improve the surgical outcome, i.e. to support the surgeon in certain situations. Whether the presented robotic systems achieve this is not part of this survey. Well-known surveys such as that by Cleary and Nguyen [1] usually cover only a restricted portion of the state of the art in terms of the number of systems described. Others are limited to a certain medical field [6][7][8] or are written by the users of a particular technology [9][10]. The listing at hand provides an overview of the worldwide state of the art in medical manipulator systems; the underlying data collection suggests that most of the currently existing systems are covered, and the reader is provided with references for further study. An accurate assignment of systems, workgroups and medical fields is partially difficult due to the varying public relations of the institutes involved, especially in Asia. Some systems are used by two or more groups for different purposes (e.g. ACUBOT / PAKY and CRIGOS / CRANIO / MINARO). From a technical point of view most systems are designed for a single purpose, but some set-ups are used by different groups for varying types of procedures. Systems designed for one specific application usually appear to be more compact and provide fewer degrees of freedom (DOF). Alternatively, industrial robots are adapted to the surgical working field; these are typically more powerful and bigger than necessary for the surgical task, but their acquisition and handling is easier. The international research and development work shows a wide distribution of the technology. The systems perform diverse tasks such as milling cavities in bone, harvesting skin, placing pedicle screws or irradiating tumours.
In the everyday practice of surgeons only a very small number of systems is used. In addition, all systems that were not clearly described as being in experimental use were classified as experimental set-ups that have not yet been tested clinically. The low acceptance can be explained by the high complexity, the strict safety precautions and FDA clearances required, and the fact that most of the systems are developed in a university environment and are therefore not available as commercial products.
5. Conclusion

The review presents more than 80 robotic systems for use in several medical disciplines, including imaging. Most of the systems are in an experimental state.
6. Acknowledgments

The work on project ITD is funded by the German Research Foundation (DFG) under grants SCHA 952/1-1 and MA 1150/39-1.
References
1. Cleary K, Nguyen C (2001), State of the art in surgical robotics: clinical applications and technology challenges. Computer Aided Surgery 6: 312-328
2. Pott P, Schwarz M (2002), Robotik, Navigation, Telechirurgie: Stand der Technik und Marktübersicht. Zeitschrift für Orthopädie 140: 218-231
3. Yanof J, Haaga J, Klahr P, Bauer C, Nakamoto D, Chaturvedi A, Bruce R (2001), CT integrated robot for interventional procedures: preliminary experiment and computer-human interfaces. Computer Aided Surgery 6: 352-359
4. Tajima F, Kishi K, Kan K, Ishii H, Nishizawa K, Fujie M, Dohi T, Sudo K, Takamoto S (2003), An MR-compatible master-slave manipulator with interchangeable surgical tools. 17th International Meeting of CARS, London, UK
5. Masamune K, Kobayashi E, Masutani Y, Suzuki M, Dohi T, Iseki H, Takakura K (1995), Development of an MRI-compatible needle insertion manipulator for stereotactic neurosurgery. Journal of Image Guided Surgery 1: 242-248
6. Suhm N, Messmer P, Jakob AL, Regazzoni P (2000), Navigationssysteme in der Unfallchirurgie - ein Überblick. OP-Journal 16: 144-149
7. Federspil P, Stallkamp J, Plinkert P (2001), Robotik - eine neue Dimension in der HNO-Heilkunde. HNO 49: 501-513
8. Lüth TC, Bier J (1999), Robot Assisted Intervention Surgery. In: Gilsbach J, Stiehl H (eds.), Neuronavigation - Neurosurgical and Computer Scientific Aspects. Springer, Heidelberg
9. Visarius H, Nolte LP (1997), Computergestützte Chirurgie: Planung, Simulation, Navigation. Das Krankenhaus 335-338
10. Honl M, Schwieger K, Gauck CH, Carrero V, Dierk O, Dries S, Hille E, Morlock M (2003), Comparison of robotic versus manual implantation of cementless primary total hip replacement - a prospective clinical study. 49th Annual Meeting of the Orthopaedic Research Society, New Orleans
OPTIMISATION OF THE ROBOT PLACEMENT IN THE OPERATING ROOM

Pierre MAILLET(1,2), Philippe POIGNET(1), Etienne DOMBRE(1)
(1) LIRMM UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, 34392 Montpellier Cedex 5, France, {maillet, poignet, dombre}@lirmm.fr
(2) MEDTECH S.A., Site EERIE, Parc G. Besse, 30035 Nîmes, France

Placement is a recurrent problem in robotics that influences robot performance and may have dramatic consequences for safety. In the operating room (OR) especially, very quick procedures should be provided to facilitate robot installation and removal. We present in this paper an original approach to robot placement based on interval analysis that offers several advantages compared with more conventional algorithms. It is validated in simulation with a three degree-of-freedom planar robot. This research work is performed in the frame of a project aiming at developing a surgical robot for orthopaedic surgery such as total knee replacement.
1. Introduction

Placement is a recurrent problem in robotics that influences robot performance and may have dramatic consequences for safety. In the operating room (OR) especially, very quick procedures should be provided to facilitate robot installation and removal. Practically, when installing a robot, the surgical staff has to deal with several constraints due to the patient morphology and the surgical procedure at hand, but also due to regulatory issues and environmental conditions (the OR is cluttered with many people and several apparatuses, regarded as obstacles by the robot). Moreover, it is well known that the dexterity of a robot is not constant throughout its operational workspace: fine motions are better performed along or about selected directions rather than others. Other constraints also have to be considered. For several surgical gestures the instrument has to track a continuous path; the robot should then be positioned such that all the points are reachable with the desired orientation of the instrument, and that the path is singularity free. Besides, repositioning the robot during an operation should be avoided if possible, as it is time consuming. All these reasons strengthen the need for optimizing the robot placement with
respect to the environment, the surgical staff, and the patient for the surgical gesture to be performed. However, we face an overconstrained optimisation problem, since many constraints conflict; thus some trade-offs have to be found. Several solutions have already been proposed for industrial robots. In high-productivity applications, cycle times have to be shortened. For applications such as spot welding, engineers also want to minimize the number of robots on the line, which means that each robot should cover as many welding points as possible. In [1], the authors present an algorithm based on two heuristic search methods: simulated annealing and a genetic algorithm. Pamanes [2] uses a non-linear search optimization technique to find the best design parameters of a robot and the optimal placement to achieve a predefined task with maximum dexterity and minimum arm length. In surgical robotics the objectives are obviously different, since safety rather than time has high priority. A few works have been reported in the literature so far. In [3], the authors present an integrated planning and simulation software for cardiovascular minimally invasive surgery (MIS) based on a semantic description of the intervention to find the best port and robot placement; this is done sequentially, after the surgeon has validated the port placement proposed by their algorithm. Criteria such as reachability from an admissible point and dexterity are maximized. Abdel-Malek [4], [5] proposes a numerical method based on maximizing the dexterity at specified target points defined by the surgeon. Engel [6] presents an algorithm which makes it possible to preoperatively determine the best relative positioning of robot and patient in craniofacial surgery; a robot manipulator assists the surgeon when drilling or milling the skull bone according to preoperatively planned cut paths.
The criteria are feasibility of the path, maximum distance to singularities, absence of collisions between robot arm segments or between robot and patient, and the surgical access path. This method is accurate, efficient and safe but time-consuming. We present in this paper an original approach based on interval computation to solve the robot placement optimization, written as a constraint satisfaction problem (CSP). This approach, illustrated in the context of orthopaedic surgery, offers several advantages compared with the aforementioned conventional algorithms: it fits well the class of CSPs to which the robot placement problem belongs; it is quite easy to implement; and it gives a set of solutions instead of a single one, which makes it possible to assess the robustness of a chosen robot placement in terms of safety. In the next section, the theoretical background of interval analysis is summarized. In the last section, the context of orthopaedic surgery and the application are
described, and an example is proposed on a three degree-of-freedom (dof) planar robot.

2. Interval analysis
Interval analysis is a particular approach to computation on sets, initially developed to quantify the errors induced by the rational representation of real numbers on computers [7]. For many years, algorithms based on interval analysis have been developed to solve CSPs. We recall in this section some definitions of interval arithmetic.

2.1. Definitions
A real interval [X], a non-empty, closed and bounded subset of the real numbers R, is defined by

[X] = [x⁻, x⁺] = {x ∈ R | x⁻ ≤ x ≤ x⁺}    (1)
Operations on numbers and Booleans are extended to sets as follows:

X ∘ Y = {x ∘ y | x ∈ X, y ∈ Y}  with ∘ ∈ {+, −, ×, /}    (2)

For example:

[x⁻, x⁺] + [y⁻, y⁺] = [x⁻ + y⁻, x⁺ + y⁺]    (3)

The image of a set X under a function f is

Y = f(X) = {f(x) | x ∈ X}    (4)

and the reciprocal image (Figure 1) of Y is given by:

X = f⁻¹(Y) = {x ∈ X | f(x) ∈ Y}    (5)
Figure 1. Reciprocal image
2.2. Inclusion function

Given a set [X] and a function on reals F : Rⁿ → Rᵐ, its natural extension to intervals can be obtained by replacing the real variables by their interval counterparts: [F] : IRⁿ → IRᵐ. [F] is an inclusion function (Figure 2) of F if F([X]) ⊆ [F]([X]) for all [X] ∈ IRⁿ.

Figure 2. Inclusion function

[F] is one of the various possible inclusion functions, and [F]* is the minimal inclusion function.
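To make the definitions concrete, here is a minimal interval type and a natural inclusion function in Python (our own sketch; the names are not from the paper). It also shows the overestimation typical of natural extensions: for f(x, y) = x·y + x the true image over [1,2] × [-1,1] is [0,4], while the interval evaluation returns the enclosing box [-1,4]:

```python
# Minimal interval arithmetic, enough to build a natural inclusion function.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def contains(self, x):
        return self.lo <= x <= self.hi

# Natural inclusion function: replace the real variables by intervals in f.
def f(x, y):
    return x * y + x

X, Y = Interval(1.0, 2.0), Interval(-1.0, 1.0)
box = f(X, Y)                    # Interval(lo=-1.0, hi=4.0), encloses [0, 4]
assert box.contains(1.5 * 0.5 + 1.5)   # every real evaluation lies inside
```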
3. Applications

3.1. Context

Total knee replacement (TKR) surgery, also called knee arthroplasty, is one of the most successful surgical procedures performed today. The most common reason for needing knee replacement is osteoarthritis, which causes deterioration of the knee cartilage; the knee joint becomes inflamed and eventually the femur and tibia rub together. Other reasons for TKR include rheumatoid arthritis and arthritis resulting from knee injury. The surgeon replaces severely damaged cartilage tissue with a prosthesis made of metal and plastic that duplicates the function of the knee joint. Today, more than 265,000 TKR procedures are performed annually in the United States [9]. Since the 1970s, the technology and long-term success of knee replacement surgery have improved dramatically, providing relief to people with chronic, debilitating knee pain. Some orthopaedic surgeons perform both traditional and computer-assisted TKR surgery. MEDTECH SA, a French engineering company, is developing a surgical robot to assist the surgeon during TKR. One objective of this project is to provide the surgeon with a procedure that automatically computes the best OR organisation according to the anatomical data of the patient, the robot features and the clinical requirements set out by the surgeon. Figure 3 illustrates a very common
configuration of the patient and the robot during a TKR operation. Among its other assignments, the robot has to reach different points on the tibia. Figure 4 shows the difficulty of finding the best robot base location from which both the distal region of the tibia near the ankle and the proximal region near the knee are attainable while the patient is lying on the operating table. This requires first translating these specifications into mathematical variables suitable for optimization, then computing coherent domains for each variable. As mentioned above, this problem is expressed as a CSP and solved with interval computation.
Figure 3. Patient position during TKR
3.2. Simulation

The objective is to find the best robot placement providing the required dexterity, avoiding singularities, and such that the robot workspace includes the target region predefined by the surgeon. A preliminary study has been done with a 2D model and a 3-dof manipulator. The set of solutions contains the positions where the robot can be placed so that its end-effector reaches all points in the target region, including the points predefined by the surgeon. In Figure 4 the robot base is placed in the best configuration, its workspace including the two regions of interest for this application.

Figure 4. Robot in operating position (robot base, robot workspace, target spaces, table, patient)
The approach is composed of three steps. First, the surgical staff places the table in the room and the patient on the table. The surgeon defines some points on the tibia that the robot will have to reach during the operation. The simulator computes the corresponding "target spaces", which include the points with a tolerance: a target region is thus defined as a set. In the second step, the manipulator workspace is computed from the interval extension of the direct geometric model (DGM), which expresses the Cartesian coordinates of the end-effector as a function of the joint variables. Figure 5 shows the result of the computation of the natural inclusion function of the DGM applied to the set [q] (q_i,min ≤ q_i ≤ q_i,max) defining the joint limits. The obtained box [x] (x_i,min ≤ x_i ≤ x_i,max) is an outer approximation of the workspace and is used as the initial box for the following step, the bisection phase [7][8].
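As an illustration of this second step, the outer workspace box of a 3-dof planar arm can be obtained by evaluating the DGM with intervals. This is a sketch under our own assumptions (made-up link lengths and joint limits, simple interval helpers rather than a library):

```python
import math

# Interval extension of the direct geometric model (DGM) of a 3-dof planar
# robot: an outer box of the workspace. Values are illustrative only.

def icos(lo, hi):
    """Enclosure of cos over the angle interval [lo, hi] (radians)."""
    if hi - lo >= 2 * math.pi:
        return (-1.0, 1.0)
    vals = [math.cos(lo), math.cos(hi)]
    k = math.ceil(lo / math.pi)          # interior extrema of cos sit at k*pi
    while k * math.pi <= hi:
        vals.append(math.cos(k * math.pi))
        k += 1
    return (min(vals), max(vals))

def isin(lo, hi):
    return icos(lo - math.pi / 2, hi - math.pi / 2)  # sin(x) = cos(x - pi/2)

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def iscale(c, a):                        # c >= 0 here (link lengths)
    return (c * a[0], c * a[1])

def workspace_outer_box(q1, q2, q3, lengths=(0.4, 0.3, 0.2)):
    """Outer box [x] of the DGM image over the joint box [q]."""
    a1 = q1                              # cumulative joint angles as intervals
    a2 = iadd(a1, q2)
    a3 = iadd(a2, q3)
    x = y = (0.0, 0.0)
    for l_i, a in zip(lengths, (a1, a2, a3)):
        x = iadd(x, iscale(l_i, icos(*a)))
        y = iadd(y, iscale(l_i, isin(*a)))
    return x, y

# Full joint ranges give the enclosing box [-0.9, 0.9] x [-0.9, 0.9].
q = (-math.pi, math.pi)
box_x, box_y = workspace_outer_box(q, q, q)
```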
Figure 5. Workspace bisection
This last procedure is applied on [x] in order to find an inner approximation of the reachable workspace guaranteeing the accessibility constraint and providing a dexterity better than a given value. The result is a list of boxes which are fully included in the workspace and whose dexterity is higher than the threshold. The grey boxes shown in Figure 5 are undetermined, since they are neither fully inside nor fully outside the workspace, and they are considered too small to be bisected again. The black boxes are the solutions: the robot positioning phase may start. Given the relative position Δx and Δy between the robot base and the solution boxes, the robot placement (x_sol, y_sol) is computed such that the solution regions (Figure 6) include the anatomical target region.
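The bisection phase can be sketched as follows. This is our own toy version, not the paper's code: it classifies boxes against the analytically known workspace annulus of a 2-link arm with unrestricted joints, and it omits the dexterity test; all names and values are illustrative:

```python
import math

# SIVIA-style bisection: classify boxes as fully reachable ("black"),
# fully unreachable, or undetermined ("grey", too small to bisect again).
# For a 2-link arm with unlimited joints the reachable set is the annulus
# |l1 - l2| <= ||p|| <= l1 + l2, which gives an exact per-box test.
L1, L2 = 0.5, 0.3
R_MIN, R_MAX = abs(L1 - L2), L1 + L2

def dist_range(box):
    """Min/max distance from the origin over box = (x_lo, x_hi, y_lo, y_hi)."""
    x_lo, x_hi, y_lo, y_hi = box
    dx = 0.0 if x_lo <= 0.0 <= x_hi else min(abs(x_lo), abs(x_hi))
    dy = 0.0 if y_lo <= 0.0 <= y_hi else min(abs(y_lo), abs(y_hi))
    d_max = math.hypot(max(abs(x_lo), abs(x_hi)), max(abs(y_lo), abs(y_hi)))
    return math.hypot(dx, dy), d_max

def sivia(box, eps, inside, undetermined):
    d_min, d_max = dist_range(box)
    if R_MIN <= d_min and d_max <= R_MAX:
        inside.append(box)                    # "black" box: fully reachable
    elif d_max < R_MIN or d_min > R_MAX:
        pass                                  # fully unreachable: discard
    elif max(box[1] - box[0], box[3] - box[2]) < eps:
        undetermined.append(box)              # "grey" box: stop bisecting
    else:
        x_lo, x_hi, y_lo, y_hi = box
        if x_hi - x_lo >= y_hi - y_lo:        # bisect along the widest side
            m = 0.5 * (x_lo + x_hi)
            sivia((x_lo, m, y_lo, y_hi), eps, inside, undetermined)
            sivia((m, x_hi, y_lo, y_hi), eps, inside, undetermined)
        else:
            m = 0.5 * (y_lo + y_hi)
            sivia((x_lo, x_hi, y_lo, m), eps, inside, undetermined)
            sivia((x_lo, x_hi, m, y_hi), eps, inside, undetermined)

inside, grey = [], []
sivia((-1.0, 1.0, -1.0, 1.0), 0.1, inside, grey)
```

Every box in `inside` is guaranteed reachable; the union of `inside` is the inner approximation of the workspace.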
Figure 6. Placement computation (solution regions ([x_s], [y_s]); anatomical target region ([x_t], [y_t]); offsets Δx, Δy)
x_sol = x_t − Δx    (6)
y_sol = y_t − Δy    (7)
The resulting placement solutions are boxes centered at (x_sol, y_sol) that represent the set of positions from which the robot can reach all points in the target region. This processing is repeated twice on the robot workspace, for the right and left elbow joint configurations, which allow the robot to reach the proximal (near the knee) and distal (near the ankle) parts of the tibia respectively. The final solution is the intersection between the black box solutions for the left elbow configuration and the grey box solutions for the right elbow configuration (see Figure 7).
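The placement shift (x_sol = x_t − Δx, y_sol = y_t − Δy) and the final intersection of the two elbow-configuration solution sets amount to simple box operations; the numbers below are hypothetical:

```python
# Box operations for the placement step; box = (x_lo, x_hi, y_lo, y_hi).
def place(target_box, dx, dy):
    """Shift the target box by the base-to-target offset (x_sol = x_t - dx)."""
    x_lo, x_hi, y_lo, y_hi = target_box
    return (x_lo - dx, x_hi - dx, y_lo - dy, y_hi - dy)

def intersect(a, b):
    """Intersection of two axis-aligned boxes; None if they do not overlap."""
    x_lo, x_hi = max(a[0], b[0]), min(a[1], b[1])
    y_lo, y_hi = max(a[2], b[2]), min(a[3], b[3])
    return None if x_lo > x_hi or y_lo > y_hi else (x_lo, x_hi, y_lo, y_hi)

# Hypothetical placement boxes for the left and right elbow configurations.
left = place((2.0, 3.0, 1.0, 2.0), dx=1.0, dy=0.5)    # (1.0, 2.0, 0.5, 1.5)
right = place((2.0, 3.0, 1.0, 2.0), dx=1.5, dy=0.5)   # (0.5, 1.5, 0.5, 1.5)
final = intersect(left, right)                        # (1.0, 1.5, 0.5, 1.5)
```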
Figure 7. Placement solutions
4. Conclusion

We have presented a method for computing the optimal robot placement in the operating room, based on interval analysis. In this paper applications with reachability and dexterity criteria are described, but the approach may be generalized, and several further criteria will be integrated in the future.
The next objectives are to perform the search with new criteria, such as maximal distance from singularities or surgeon comfort, and to extend the simulation to the 3D case.
References
1. D. Barral, J.P. Perrin, E. Dombre and A. Liégeois, "An evolutionary simulated annealing algorithm for optimizing robotic task point ordering", Proceedings of the 1999 IEEE International Symposium on Assembly and Task Planning, Porto, Portugal, July 1999.
2. J.A. Pamanes G., J.P. Montes H., E. Cuan D., C.F. Rodriguez H., "Optimal placement and synthesis of a 3R manipulator", International Symposium on Robotics and Automation (ISRA 2000), Monterrey, Mexico, November 2000.
3. E. Coste-Manière, L. Adhami, R. Severac-Bastide, A. Lobontiu, J.K. Salisbury Jr, J.D. Boissonnat, N. Swarup, G. Guthart, E. Mousseaux, A. Carpentier, "Optimized port placement for the totally endoscopic coronary artery bypass grafting using the da Vinci robotic system", Proc. Intl. Symp. on Experimental Robotics, Springer, December 2000.
4. K. Abdel-Malek and W. Yu, "Placement of robot manipulators to maximize dexterity", International Journal of Robotics and Automation.
5. K. Abdel-Malek, "Criteria for the locality of a manipulator arm with respect to an operating point", IMechE Journal of Engineering Manufacture, Vol. 210(1), pp. 385-394, 1996.
6. D. Engel, W. Korb, J. Raczkowsky, S. Hassfeld, H. Wörn, "Location decision for a robot milling complex trajectories in craniofacial surgery", International Congress Series, 2003.
7. L. Jaulin, M. Kieffer, O. Didrit and E. Walter, Applied Interval Analysis, Springer, 2001.
8. A. Fuhrmann and E. Schömer, "A general method for computing the reachable space of mechanisms", Proceedings of DETC '01, ASME 2001 Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Pittsburgh, PA, September 9-12, 2001.
9. Mayo Clinic website: www.mayoclinic.org/kneereplacement-jax/
SAFETY OF SURGICAL ROBOTS IN CLINICAL TRIALS

W. KORB, R. BOESECKE, G. EGGERS, B. KOTRIKOVA, R. MARMULLA, N. O'SULLIVAN, J. MÜHLING, S. HASSFELD*
Department for Oral- and Craniomaxillofacial Surgery, University Hospital Heidelberg, INF 400, 69120 Heidelberg, Germany
E-mail:
[email protected] D. ENGEL, H. KNOOP, J. RACZKOWSKY, H. WORN Institute for Process Control and Robotics University of Karlsruhe, Kaiserstrasse 12, Building 40.28, 76128 Karlsruhe, Germany
This paper emphasizes the importance of applying systematic approaches to quality management, also in clinical trials. Ideas from the literature for improving safety in surgical robotics are presented, and methods for the verification and validation of surgical robots are explained to assure safety in clinical trials.
1. Introduction

Surgical robots¹,²,³ are complex mechatronic systems. It is important to pay special attention to the design and manufacturing of prototypes, including hardware and software⁴,⁵,⁶, and to apply systematic approaches to quality and safety management⁷,⁸. For medical device manufacturers who want to develop and distribute robots, it is natural to implement a quality management system and to consider the special requirements of such complex systems. In universities and research institutions it is often not possible to apply the same measures for software design and error analysis. Nevertheless, these considerations are very important for clinical trials.

*Work partially supported by Collaborative Research Centre 414 (Deutsche Forschungsgemeinschaft, DFG).
2. The seven surgical robot risks
By analysing different surgical robot systems (both from our own research and from the literature), it is possible to identify the following problematic points, which we call the seven surgical robot risks (7 SRRs):

(1) Image processing and planning: The main problem in robotic surgery is not the precise execution (robots work very accurately) but the precise planning. This is mainly due to the limited accuracy of image processing and the problems of quantitatively valid segmentation and 3D modelling. Other problems arise if 3D models are not needed because planning and execution take place in the "raw" image slices (as with biopsy robots or telemanipulators); in such cases the accuracy depends on the calibration procedure as well as on the tracking of the imaging device.

(2) Registration: Registration of the robot coordinate system with the patient position is one of the key aspects of such procedures and is always accompanied by accuracy errors. So far, landmark-based methods have often been used. These cause pain and discomfort for the patients if artificial landmarks are used, and often poor precision if anatomical landmarks are used. Surface-based methods are available in some research projects. If camera-based methods are used for registration or as a redundant supervision system³,⁹, there are, besides the accuracy problems, practical problems such as the line-of-sight problem (the tracker body and the camera line must not be obstructed).

(3) Movement: This is naturally another very important point. If industrial robots are used, the speed and power have to be reduced in order not to harm the patient or the surgeon. If specially designed surgical robots are used, the speed is normally specified as low because, unlike in industrial applications, the main goal is not time reduction in surgery but minimal invasiveness and accuracy. There are two possible modes of movement (sometimes both are implemented in a single system, sometimes only one): (a) Automatic movement, initiated by the controller and executed according to the pre-planned trajectories. Critical points here are especially collisions with persons or equipment and proper (accurate) path execution (including path planning). Therefore the movement is normally supervised by different sensors (such as force-torque sensors, which detect collisions) and a confirmation button pushed by the surgeon to confirm movement³,⁹. (b) In hand-controlled mode the robot is guided by joystick or directly by pushing or pulling the instrument (with the aid of a force-torque sensor). It is very important to design the hand-controlled mode intuitively and to restrict the movements to be as slow as possible, so that the surgeon can follow the movement by eye and react appropriately.

(4) Reliability of control software: Quality management for safe and reliable software has to be introduced from the very beginning of the development; this is true also in research projects. Appropriate methods are shown by Varley⁵ or Ng⁶.

(5) Vigilance: A protocol for behaviour in an emergency is essential before the first patient trial is conducted. Such emergency plans are familiar to the surgeon, but the robot equipment has to be considered additionally. It has to be ensured that the cumbersome robot equipment can be removed quickly in case of emergency, because otherwise it is not possible to reach the patient with the emergency equipment sufficiently quickly. The removal of the robot is especially challenging if an instrument is introduced into the patient.

(6) Hygienic considerations: As for all medical devices, methods for cleaning and sterilisation have to be considered, covering the robot, the tools, but also the input devices (e.g. keyboard and mouse or touch-screen).

(7) Clinical workflow: Computer- and robot-assisted surgery always affects the clinical workflow. This includes pre-operative steps, such as surgical operation planning, but also intraoperative steps, such as sterilisation, as well as other safety-related steps like fixation of the patient, patient and robot supervision, registration, etc. An intuitive and fluent workflow is indispensable. The often complex planning and intraoperative handling reduces the acceptance of such equipment, but also plays an important role in terms of safety.
3. Methods of Risk Analysis
To analyse the discussed and other minor problems (hazards), it is recommended to use a systematic approach⁷. One possible approach is a fault tree analysis (FTA), including the analysis of error severity and error probability; see the standards IEC 601-1-4¹⁰ and ISO 14971. The fault tree allows the possible hazards to be analysed by reducing them to single sub-hazards, these sub-hazards to further sub-sub-hazards, and so on. Another approach is the use of an event tree analysis (ETA), which is a bottom-up approach: in ETAs, in contrast to FTAs (in which hazards are the root event), the failure event is the root of the tree and the possible events induced by the failure are investigated. If, besides error severity and occurrence probability, the detection probability is also to be investigated, a failure mode and effects analysis (FMEA) is suggested, which yields the so-called risk priority number (determined from severity, occurrence and detection rate).
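The FMEA risk priority number mentioned above is conventionally the product of the three ratings. A small sketch (the 1-10 scales, failure modes and ratings below are invented for illustration, not taken from a real analysis):

```python
# FMEA sketch: risk priority number RPN = severity * occurrence * detection.
def rpn(severity, occurrence, detection):
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must be in 1..10")
    return severity * occurrence * detection

# Hypothetical failure modes of a surgical robot with illustrative ratings.
failure_modes = [
    ("registration drift",          9, 4, 6),   # severe and hard to detect
    ("force sensor miscalibrated",  7, 3, 4),
    ("confirmation button stuck",   5, 2, 2),
]
# Address the highest RPN first: by design change, then protection, then warning.
ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```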
frequent: in nearly every use
feasible: occurs regularly, especially induced by other failures
occasional: occurs sometimes
conceivable: occurs seldom, in extreme examples
unlikely: never occurred so far, will probably never occur
errors: no probability can be determined; software or user failure (cf. IEC 601-1-4)

Figure 1. In the risk analysis of mechatronic systems it is often not possible to determine absolute numbers for the incidence of an error. Therefore categories are used for the error occurrence probability, as well as for the severity of an error.
Independent of the applied tool (FTA, ETA or FMEA), risk management is applied as an iterative process:

(1) Detect hazards (risks).
(2) Devise actions for risk control.
(3) Implement risk minimization (if possible).
(4) Verify that the residual risk is "acceptable".
(5) Detect whether new hazards are generated.
(6) Restart the procedure.

It is well known that this process has to be continued until the hazards are "as low as reasonably practicable" (ALARP principle). Risk minimization can be performed by three methods: (a) design of the system, (b) protection mechanisms and (c) warnings. These methods are applied in the given order. Only as the last option, if no reasonable design or protection mechanism can be found (in view of cost and effort), is it admissible to give "just" warnings. These warnings are given in the user's manual or on
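The iterative loop above can be sketched as follows. The numeric risk values, the ALARP threshold and the reduction factors are invented for illustration; the ordering design, then protection, then warning mirrors the preference stated in the text:

```python
# Illustrative sketch of the iterative risk-management loop
# (detect -> control -> verify residual risk -> re-check).
# All numbers are hypothetical; real processes follow ISO 14971.

ALARP_THRESHOLD = 10  # hypothetical acceptable risk level

def control(hazard):
    """Apply risk minimization in the preferred order:
    design change, then protection mechanism, then warning."""
    for measure, reduction in (("design", 0.3), ("protection", 0.5), ("warning", 0.8)):
        if hazard["risk"] * reduction <= ALARP_THRESHOLD:
            hazard["risk"] *= reduction
            hazard["measure"] = measure
            return
    # No single measure suffices: apply the strongest one and iterate again.
    hazard["risk"] *= 0.3
    hazard["measure"] = "design"

hazards = [{"name": "collision", "risk": 60}, {"name": "sensor drift", "risk": 12}]
iterations = 0
while any(h["risk"] > ALARP_THRESHOLD for h in hazards):
    for h in hazards:
        if h["risk"] > ALARP_THRESHOLD:
            control(h)
    iterations += 1
```

The outer `while` loop corresponds to step (6): risk control is repeated until all residual risks fall below the (here purely numeric) ALARP bound.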
labels directly attached to the robot system.

4. Conclusion
It is important to apply dedicated methods for design and prototyping in surgical robotics in order to develop safe systems. Furthermore, systematic methods for risk analysis and management need to be applied in order to assure the claimed safety. Such methods help to improve the system and are a prerequisite for clinical trials.
References
1. T. Lueth and J. Bier. Robot assisted intervention in surgery. In J.M. Gilsbach and H.S. Stiehl, editors, Neuronavigation - Neurosurgical and Computer Scientific Aspects. Springer, 1999.
2. R.H. Taylor, S. Lavallée, G.C. Burdea, and R. Mösges, editors. Computer Integrated Surgery. MIT Press, Cambridge, Mass., 1996.
3. D. Engel, J. Raczkowsky, and H. Wörn. A safe robot system for craniofacial surgery. In International Conference on Robotics and Automation (ICRA), pages 2020-2024, Seoul, Korea, June 2001.
4. B. Davies. A review of robotics in surgery. Proc Inst Mech Engrs, 214(Part H):129-139, 2000.
5. P. Varley. Techniques for development of safety-related software for surgical robots. IEEE Transactions on Information Technology in Biomedicine, 3(4):261-267, 1999.
6. W.S. Ng and C.K. Tan. On safety enhancements for medical robots. Reliability Engineering and System Safety, 54:35-45, 1996.
7. W. Korb, D. Engel, R. Boesecke, N. O'Sullivan, J. Raczkowsky, R. Marmulla, and S. Hassfeld. Risk analysis for a reliable and safe surgical robot. In H.U. Lemke, M.W. Vannier, K. Inamura, A.G. Farman, and K. Doi, editors, Proceedings of the 17th International Symposium and Exhibition on Computer Assisted Radiology and Surgery (CARS 2003), pages 766-770. Elsevier, Amsterdam, 2003.
8. J. Schneider. Risikomanagement in der Medizintechnik. Berichte aus dem Institut für Medizinische Physik (W.A. Kalender, ed.). Shaker, Aachen, Germany, 2000.
9. P. Knappe, I. Gross, S. Pieck, S. Kuenzler, F. Kerschbaumer, and J. Wahrburg. Design of an interactive navigated robot for surgical interventions. In J. Troccaz and P. Merloz, editors, Computer-Aided Medical Interventions: Tools and Applications (Surgetica 2002, Grenoble, France), pages 83-89. Sauramps Medical, Montpellier, France, 2002.
10. IEC 601-1-4: Medical electrical equipment, Part 1: General requirements for safety; 4. Collateral standard: Programmable electrical medical systems (1996). (German: DIN EN 60601-1-4)
11. ISO 14971: Medical devices - Application of risk management to medical devices (2000).
12. C.R. Maurer Jr., R.J. Maciunas, and J.M. Fitzpatrick. Registration of head CT images to physical space using a weighted combination of points and surfaces. IEEE Trans Med Imaging, 17(5):753-761, 1998.
ROBOTICS IN HEALTH CARE. INTERDISCIPLINARY TECHNOLOGY ASSESSMENT INCLUDING ETHICAL REFLECTION

MICHAEL DECKER
Institute for Technology Assessment and Systems Analysis, Research Centre Karlsruhe, PO Box 3640, D-76021 Karlsruhe, Germany
Technology Assessment is a problem-oriented endeavour dealing with political, societal, ecological and other problems. Usually the perspectives of different scientific disciplines have to be combined in order to develop interdisciplinarily based recommendations for action. The role of ethical reflection in this process depends on the problem situation. Whenever a technical application is on the agenda that cannot be allocated to a so-called "business as usual" case, ethical reflection is asked for. Two case studies dealing with robotics in health care are presented as examples of problem settings in which interdisciplinary TA succeeds in developing discipline-crossing argumentation chains.
1. Introduction

New technologies influence our everyday life. Once they are introduced to the real world, one can identify those people who profit from them, those who are not or less affected, and those who suffer disadvantagesa. Decisions about new technologies therefore contain decisions about who should win and who should lose. Conflicts are foreseeable. Political decision makers are asking for solutions acceptable to the voters. Technology Assessment (TA)b contributes to the problem solving by evaluating different options, sorting out various arguments, etc. TA thereby answers questions like "in which future society do we want to live?" or, more normatively, "in which future society should we want to live?" The societal problems in connection with new technologies in general need an interdisciplinary TA approach, because only in rare cases is one single scientific discipline able to develop the problem solutions alone. Mostly technical, economic, ecological, legal, ethical and other aspects have to be taken into account. Moreover, the development of problem solutions needs intense cross-correlations between the scientific disciplines, because the argumentation chainsc leading to acceptable solutions are interdisciplinary themselves, i.e. they are a combination of many disciplinary (partial) arguments. In this contribution this will be demonstrated on the case study of autonomous robots in health cared.

a A new city highway, for example, brings advantages for those who live in the countryside and work in the city. People who live next to the new highway have disadvantages because they are exposed to the pollutants and the noise, and their plot of land loses value. Citizens who live in the opposite part of the city are less affected.
b See for a definition of TA in this sense, i.e. focussing on the social relevance, [1].

Especially in the field of health care many robot applications are being developed, and some of them have already reached market maturity. A problem-oriented health technology assessment first of all has to deal with technical aspects. Are the robots able to do what they are supposed to do? Usually the developer of a robot defines the technical criteria the robot has to fulfil together with an expert for health care; the latter will also be involved in checking the capabilities of the prototype. In addition, economic aspects have to be taken into account. Is the overall performance of the robot cheaper than the comparable solution chosen up to now? This is typically determined by an extensive cost-benefit analysis. Moreover, legal aspects have to be considered due to liability issues. Who is responsible for malfunctions of the robot? This becomes relevant especially if the robot is equipped with a learning algorithm. Finally, ethical aspects have to be taken into account, because the treatment and the nursing of patients, elderly people, the handicapped etc. is deeply rooted in our social behaviour, and therefore one has to ask in which areas we, as a modern society, should go for the replacemente of previously human tasks by robots and in which areas we just do not want a robot to act instead of a human being. In the following, the above-sketched need for interdisciplinary technology assessment will be underlined by referring to two examples from health care in detail. Subsequently, some answers are presented which demonstrate the interdisciplinarity of the argumentation chains. Thereby the role of ethical reflection is stressed.

2. Two Examplesf
The first case study is ROBODOC, developed by ISS (Integrated Surgical Systems, Inc.). The ROBODOC Surgical Assistant System is intended for use in patients requiring primary cementless total hip replacement surgery. The
c These argumentation chains are mainly if-then clauses.
d This example refers to the project "Robotics. Options for the Substitutability of Humans" organised by the Europäische Akademie GmbH [2].
e By using the notion "replacement" it is assumed that the task to be fulfilled by the robot has up to now been done by humans.
f The aim of these examples is to demonstrate that there are questions to be answered by different scientific disciplines. The complete argumentation chains, including the recommendations for action, can be found in [2].
ROBODOC System requires the use of the ORTHODOC Preoperative Planning Workstation. A computed tomography (CT) scan of the patient's femur is loaded from a tape into a workstation, allowing the surgeon to position a 3-D representation of a selected prosthesis within a femoral image created from the CT data. After the surgeon has selected a prosthesis and designated its location, the coordinates of the prosthesis relative to landmarks on the femur are calculated. Three titanium pins, implanted prior to the CT scan, serve as the landmarks for spatial orientation. Prior to hip replacement surgery, data from the preoperative planning session are transferred to the ROBODOC Surgical Assistant. The ROBODOC Surgical Assistant consists of a robotic arm with a distal high-speed milling bur and is controlled by a computer containing ROBODOC control software. The robotic device features a base with wheels, which is positioned next to the operating table. It has a bone clamp fixation device that prevents movement of the femur. After incision, exposure and removal of the femoral head, and fixation of the femur, the device mills the femoral canal to create a cavity of the appropriate size and shape for the selected femoral prosthesis. The device offers the ability to complete preoperative planning of the femoral implant procedure, including surgeon-controlled selection of prosthesis size, shape, and placement and controlled milling of the femur in preparation for a prosthesis, replacing the use of a manual surgical broach and hand-held milling toolsg. Concerning technical aspects, one would check whether the above-mentioned skills are realisable with ROBODOC, and one would compare the overall capability with the so-called standard treatment, i.e. with a surgeon who mills the hole by hand.
The criteria to scrutinize the technical performance would refer to aspects like "How good is the contact between the surfaces of the bone and the prosthesis?", "How exactly was it possible to reach the optimal prosthesis position planned in advance?", "Is the patient treated with care during the operation process?", "Is the new treatment more risky than the standard treatment?", etc. An economic perspective would ask for a cost-benefit analysis. Are hip replacement surgeries with ROBODOC cheaper than those without robot assistance? How does the surgery duration differ? Is the same amount of human assistance (assistant doctors, operating room nurses, etc.) required? Is a longer durability of the artificial hip achievable? From a legal point of view, mainly liability aspects are relevant. The surgery is planned by the surgeon, but the execution is done by the robot equipped with the control system of the robot producer. Is there a problem with shared responsibilities? Legal aspects could also refer to the emergency performance of the robot. How is the robot manageable in case of a power breakdown, disruption of the data line, etc.?

g This description can be found at http://www.orthopaedics.northwestern.edu/orthopaedics/research/robodoc.htm

From an ethical perspective, the question has to be answered whether the introduction of a semi-autonomous robot into an operating room opens a new category of technology, or whether this is still within the frame of "business as usual" [3], or "is the relation between physician and patient varied in a virulent way by the introduction of the robot?" The second case study is fictive but based on Care-O-bot [4], a prototype of a multi-functional robot assistant for housekeeping and home care, to be used by elderly or handicapped people in order to live independently in their homes for a longer time. Therefore, an easy, intuitive, and dependable operation of the home care system was realised. Care-O-bot is able to cope with many different situations and fulfils complex tasks even in dynamic environments. Furthermore, the robotic assistant is able to execute not only single tasks at a time but several tasks concurrently. The further developed Care-O-bot II is equipped with a manipulator to perform household tasks such as fetch-and-carry duties, setting the table or basic cleaning. Moreover, it is a mobility aid which enables the patient to move behind the robot, keeping the patient steady. In addition, Care-O-bot is a communication tool: it includes a camera system and a video phone. In order to create our case study, one can imagine a combination with a patient supervisory tool put in a finger ring, which detects medical data like blood pressure, pulse frequency, oxygen concentration in the blood, etc. These data can be transferred automatically to the nearest hospital or the family doctor. The robot can be equipped with a learning algorithm, which enables it to manoeuvre in an unknown environment.
This learning algorithm could also include an adaptive human-machine interface in order to recognise and learn about the individual behaviour of "its" patient. In addition to these technical criteria, economic aspects must be taken into account. Due to the fact that nursing in recuperation homes is extremely expensive, even an expensive robot, borrowed or leased from the health insurance, would have the potential to save money for the health care systemh. Taking into account the wish of many elderly people to stay as long as possible within their social setting, one could expect a win-win situation. As already mentioned, the legal perspective on this scenario refers mainly to liability aspects. Who is liable for a malfunction of the robot? Due to the learning algorithm there might be a conflict between robot producer and robot owner. The robot producer builds a learning algorithm which defines several corridors of behaviour of the robot. The robot owner trains the robot for his own use, which on the one hand makes the robot's actions unpredictable for the robot producer, who would therefore reject liability for malfunctions due to wrong training. On the other hand, the robot owner is not an expert in robotics; he also does not want to be responsible for malfunctions of the robot. Ethical reflection would focus on the nursing scenario in general. Up to now it is unusual to transfer nursing tasks to robots at all. Nursing is a typical task to be fulfilled by human beings. The question remains whether a modern society should and wants to hand over some tasks of nursing to robots.

h Personal communication with Professor Dr. Dr. Karl Lauterbach, Institute of Health Economics and Clinical Epidemiology, University of Cologne.
3. Interdisciplinary Argumentation Including Ethical Reflection
That ethical reflection plays a role in TA in general can be justified by referring to the fact that technical actions nowadays are actions under uncertainty and inequality [5, 6 and, in this context, 7]. Decision making is therefore confronted with problems concerning the fairness of the distribution of chances and risks and concerning actions under uncertainty, because usually the "winners" (beneficiaries) of a new technology are different from the potential "losers" (those affected by unintended consequences) (cf. footnote a) [8]. Technology Assessment deals with new technologies in a means-end relation, i.e. a new technology is developed, or should be developed, to reach a particular end. During the discussion one compares different means with respect to their intended and unintended consequences. In addition, scrutinizing the end itself is of crucial importance. Concerning their role in the interdisciplinary discussion described above, all scientific disciplines are equal. At second sight, however, it becomes obvious that there are some differences concerning their inputs: ethical reflection deals mainly with the ends of technology development. Contributions to answering questions like "Which technologies could be developed, which should be developed and which should not be developed?" are made by professional ethics. Usually the discussion about the ends should take place prior to the debate on the different means to reach these ends, because the potential to reach the end is a strong argument in the discussion on means. Therefore one could argue that ethical reflection should take place at the very beginning of the TA project and set the unchangeable boundaries for the interdisciplinary discussion; later on, one "only" discusses the means to reach these ends. However, during the discussion about ends it becomes obvious very soon that there are conflicting ends in the debate.
The ends to be reached by new technologies are in the focus of criticism themselves [9]. In order to come to rational decisions, ethical reflection has to justify the ends to be reached by new technologies. This endeavour needs the input of, and is deeply cross-correlated with, the other scientific disciplines, as can be shown in the following examples. Having autonomous robots assisting in the operating room seems at first view not that extraordinary. An operating room is equipped with high-tech inventory, like e.g. a heart-lung machine, anyway. Does a semi-autonomous robot which mills a hole in a bone make a special difference there? Questions like these have to be answered with reference to technical and legal aspects. Where are the differences between a hand-held mill and a robot-controlled mill? Technically viewed, one compares criteria like "push-forward pressure" and "accuracy", "average probability of bone damage" and "quality of the transfer of the 3D operation plan to the surgery itself". The legal aspects refer to these technical aspects. They focus on the liability in case of malfunction. The high-resolution 3D data of the CT are used by the surgeon in order to plan the surgery. This plan has to be fulfilled by the robot during the surgery. To make sure that there are no discrepancies between the operation plan and the factual milling of the robot, both the plan and the robot control system must be recorded. One can imagine a kind of black box within the robot. In case of malfunction it can then be identified whether the failure originated from the planning or from the robot, which means from the robot producer. In addition, a kind of emergency automatism might be necessary. This allows, for example, an emergency stop during the operation and the manual pulling back of the robot from the patient without causing additional damage. If these aspects are taken into account, the use of ROBODOC can be seen as a "business as usual" case from the ethical point of view [3]. Another point touches the economic perspective.
Up to now an overall cost-benefit analysis is not possible, due to the fact that a manually milled hip prosthesis lasts around 15 years and the overall duration of use is a crucial factor in the benefit analysis. The costs of the surgery with robot assistance are higher, and the operation takes about 15 minutes longer. The hope of a longer duration of use is mainly based on the better contact between prosthesis and bone. But one can also find some less optimistic statements (ZEIT Nr. 20/99, p. 42 and Nr. 32/99, p. 29; Frankfurter Rundschau Nr. 133, 12 June 1999, p. 6). Autonomous robots in geriatric nursing are not in our "business as usual" scenario concerning the nursing of elderly members of society. Up to now this task has been taken over by human caregivers, and the introduction of robots into the relationship between patients / elderly people and nursing staff needs ethical reflection. Of course the best solution would be to realize care-taking by humans which takes into account the individual needs, the autarky and the dignity of the aged people. However, as inquiries have shown, the nursing by humans is sometimes not very "human" [10]. Moreover, there is a decrease of entrants into geriatric nursing (Süddeutsche Zeitung, 3 January 2002). Nursing by humans
is very expensive, and there is the additional danger that relations of dependency can be misused. Within the market of health care, the cost aspect plays an important role due to the overall limited budget (as, for example, in Germany). Within an exhausted budget, the justification to consider a more expensive treatment than the standard treatment argues not only against this present standard treatment (concerning the same disease) but against all the treatments of all other diseases. This is a typical question of fairness of allocation. For which reasons should it then be possible to reject a technical solution with an enormous health-economic potential? This is a typical conflict on the level of ends. Another aspect the ethicist has to consider, for example, is the weighing of the prohibition of the instrumentalization of human beings against the gain of increased autarky. The latter can be achieved by the cooperation of the human being and the robot, which is not a human being and is therefore available all the time and for unlimited time. Especially younger patients or handicapped people welcome the increased independence from other people or nursing staff. The former takes into account that the human being might be instrumentalized within this cooperation. Nursing is a very intense relationship between human beings (caregiver and patient), especially in long-term nursing. Within such a relation it might happen that the nurse has to guide the patient. Transferred to a human-robot relationship, this would mean that the robot attains a kind of control over the humani. This contradicts the universal agreement on the prohibition of the instrumentalization of human beingsj. These case studies demonstrate that several scientific disciplines contribute to a common chain of argumentation, which requires intense interdisciplinary discussionk. By that it is also ensured that the ethical reflections "keep contact" with the real-world problems.
The other scientific disciplines take care of the relevance of the ethical reflection [9]. Applied ethics in TA deals with the ethics of technical action [12]. A detailed understanding of, and dealing with, the technical, economic and legal implications of this technical action is a necessary prerequisite for that.
i It is not self-evident that the human is at the top of the control hierarchy of the human-robot cooperation system. Especially in the case of autopilots, where many accidents are caused by problems of handling the pilot-machine interface, one thinks about leaving the last decision to the computer/robot (Bild der Wissenschaft 5/97, p. 59).
j In [2] it has been recommended to use robots in the context of nursing only as tools for the human nurse and as instruments to raise the patient's autarky. One has to avoid the reduction of human nursing staff.
k A description of the process can be found in [11].
References
1. M. Decker, M. Ladikas, "Bridges between Science, Society and Policy. Technology Assessment - Methods and Impacts". To be published by Springer, Heidelberg et al. (2004).
2. T. Christaller, M. Decker, J.-M. Gilsbach, G. Hirzinger, K. Lauterbach, E. Schweighofer, G. Schweitzer, D. Sturma, „Robotik. Perspektiven für menschliches Handeln in der zukünftigen Gesellschaft". Springer, Heidelberg et al. (2001).
3. A. Grunwald, "Against Overestimating the Role of Ethics in Technology". Science and Engineering Ethics 6, pp. 181-196 (2000).
4. M. Hans, B. Graf, R.D. Schraft, "Robotic Home Assistant Care-O-bot: Past - Present - Future". In Proc. of the 11th IEEE Int. Workshop on Robot and Human Interactive Communication, ROMAN 2002, Berlin, Germany, pp. 380-385 (2002).
5. C.F. Gethmann, „Die Rolle der Ethik in der Technikfolgenabschätzung". In: T. Petermann, R. Coenen (eds) „Technikfolgen-Abschätzung in Deutschland. Bilanz und Perspektiven". Campus, Frankfurt am Main (1999).
6. C.F. Gethmann, „Rationale Technikfolgenbeurteilung". In: A. Grunwald (ed) „Rationale Technikfolgenbeurteilung. Konzepte und methodische Grundlagen". Springer, Heidelberg et al., pp. 1-10 (1999).
7. M. Decker, "The role of ethics in interdisciplinary technology assessment". Poiesis and Praxis: Int. Journal of Ethics of Science and Technology Assessment, volume 2, issue 2/3, March (2004).
8. A. Grunwald, „Technikfolgenabschätzung - Eine Einführung". Edition Sigma, Berlin (2002).
9. A. Grunwald, „Technik für die Gesellschaft von morgen. Möglichkeiten und Grenzen gesellschaftlicher Technikgestaltung". Campus, Frankfurt (2000).
10. S. Loerzer, „Immer mehr Beschwerden. Verbesserungen in der Altenpflege scheitern am Geld". In: Sueddeutsche.de, 16 January 2003.
11. M. Decker, A. Grunwald, "Rational Technology Assessment as Interdisciplinary Research". In: M. Decker (ed) "Interdisciplinarity in Technology Assessment. Implementation and its Chances and Limits". Springer, Heidelberg et al., pp. 32-60 (2001).
12. C.F. Gethmann, „Ethische Aspekte des Handelns unter Risiko". In: M. Lutz-Bachmann (ed) „Freiheit und Verantwortung". Morus-Verlag, Berlin, pp. 152-169 (1991).
Sensor Feed-Back Systems
PALPATION IMAGING USING A HAPTIC SYSTEM FOR VIRTUAL REALITY APPLICATIONS IN MEDICINE
W. WALED1, S. REICHLING1, O.T. BRUHNS1, H. BOESE2, M. BAUMANN2, G. MONKMAN3, S. EGERSDOERFER3, D. KLEIN4, A. TUNAYAR4, H. FREIMUTH4, A. LORENZ5, A. PESAVENTO5, and H. ERMERT1
1 Ruhr-University Bochum, Bochum, Germany
2 Fraunhofer-Institut für Silicatforschung, Würzburg, Germany
3 Fachhochschule Regensburg, Regensburg, Germany
4 Institut für Mikrotechnik Mainz GmbH, Mainz, Germany
5 LP-IT Innovative Technologies GmbH, Bochum, Germany

Abstract. In the field of medical diagnosis there is a strong need to determine mechanical properties of biological tissue, which are of histological and pathological relevance. Malignant tumors are significantly stiffer than the surrounding healthy tissue. One of the established diagnostic procedures is the palpation of body organs and tissue. Palpation is used to measure swelling, detect bone fractures, find and measure the pulse, or to locate changes in the pathological state of tissue and organs. Current medical practice routinely uses sophisticated diagnostic tests such as magnetic resonance imaging (MRI), computed tomography (CT) and ultrasound (US) imaging. However, these cannot provide a direct measure of tissue elasticity. Last year we presented the concept of the first haptic sensor actuator system to visualize and reconstruct mechanical properties of tissue using ultrasonic elastography and a haptic display with electrorheological fluids. We developed a real-time strain imaging system for tumor diagnosis. It allows biopsies simultaneously with conventional ultrasound B-mode and strain imaging investigations. We deduce the relative mechanical properties by using finite element simulations and numerical solution models solving the inverse problem. Various modifications of the haptic sensor actuator system have been investigated.
This haptic system has the potential of applying substantial forces in real time, using a compact, lightweight mechanism which can be applied to numerous areas including intraoperative navigation, telemedicine, teaching and telecommunication.
1. Introduction
The haptic sensor actuator system has been designed for the visualization and reconstruction of mechanical tissue properties using ultrasonic strain imaging (also called elastography) with a haptic display based on electrorheological fluids [1-4]. Five German institutes and companies are working in a collaborative project to develop a haptic sensor actuator system supporting a variety of applications in medicine, education and telecommunication [5]. The
goals of this development are (1) to develop an accurate imaging technique which leads to an absolute, or at least a useful relative, measurement of tissue elasticity parameters, (2) to improve the visualization of mechanical properties by the reduction of artifacts, and (3) to reconstruct tissue properties on a controllable haptic display capable of emulating and differentiating normal and abnormal biological tissue. The general design of the haptic system is presented in the next section of this paper. Some newly designed methods and results of the real-time ultrasound elastography and the tactile actuator design are described in further sections.
2. Haptic Sensor Actuator System

The scheme of the haptic sensor actuator system (HASASEM), which consists of a sensor head and a separate actuator array, is shown in Fig. 1. The sensor is based on real-time ultrasound elastography. The elastography system is able to detect even small lesions located far from the surface, which are not detectable by manual palpation or conventional ultrasound systems. The sensor part consists of a standard ultrasound system modified for RF-data access, equipped with a 9 MHz linear array probe and sampled by a conventional ADC card and a desktop PC. The actuator consists of a tactile display, which has a hybrid configuration consisting of micro-machined cells containing smart fluids with electrically controlled rheological properties. Electrorheological fluids are suspensions of electrically polarizable particles in a non-conducting carrier liquid. The effective viscosity of the electrorheological fluid increases with the application of a strong electrical field. This effect is fast, reversible and can easily be exploited in the design and construction of tactile elements. Such haptically explorable arrays require up to 2 kV, but at very modest current levels (typically under 20 µA per element). The need for electrical isolation between digital control outputs and high-voltage inputs makes an optically controlled element the device of choice. The required voltage level rules out most low-cost semiconductor components.

Fig. 1 Scheme of the haptic sensor actuator system (HASASEM)
Furthermore, in order to achieve the required stiffness changes, electrorheological fluids and actuator designs, together with various modes of operation, are being evaluated. The actuator array is envisioned to consist of 1024 elements with a spatial haptic resolution of a few millimeters, which corresponds to the resolution of the receptors on the fingertip. The haptic sensor actuator system will serve as a new base for various potential applications, including virtual palpation to detect and characterize subsurface hard tumors, intraoperative navigation to locate subsurface organs, especially in remote surgery, where the surgeon is incapable of palpating the tissues of interest before they are cut, and virtual training simulation for medical education.

3. Remote Palpation Based on Real-Time Elastography
Palpation is one of the standard screening procedures for the detection of breast, thyroid, prostate, and liver abnormalities. The pathological state of soft tissue is often correlated with changes in stiffness, which allows a qualitative estimation of tissue elasticity characteristics. However, palpation is not very accurate because of its poor sensitivity with respect to small and deeply located lesions, as well as its limited accuracy in terms of the morphological localization of lesions. Elastography, a new imaging modality based on ultrasound, is of growing interest because of its capability to visualize elastic properties. In addition to tumors in soft tissue, elastography is able to detect calcifications in blood vessels, for example in the coronary arteries. With this new method (Fig. 2), small displacements between ultrasonic image pairs, acquired under varying small axial compression, are determined using a cross-correlation analysis of corresponding echo lines within the RF data sets. The derivative of the displacement field is equal to the strain in the tissue. The strain imaging system uses the fast phase-root algorithm for strain estimation, leading to a frame rate of up to 40 frames per second [6].
Fig. 2 A tissue-mimicking soft phantom with a hard inclusion is slightly compressed ("palpated") using the ultrasound transducer (far left). Because of the uniform reflection of the isoechoic inclusion, the rigid cylindrical body in the phantom is not visible in the conventional B-mode image. The same body is visible in the elastogram (strain image) to the right.
In an experiment to reconstruct the elastic properties, as shown in Fig. 3a, an ultrasonic tissue-mimicking gelatine phantom containing a harder cylindrical agar inclusion of 7 mm diameter was imaged using the elastography system; a step motor with a compressor plate was used to generate a plane strain state.
Fig. 3 a) Schematic diagram of the experimental setup. b) Block diagram of the integrated software for numerical modeling of the elastographic problem.
Unfortunately, strain images do not represent a quantitative measure of elasticity, because they do not take into account how the stresses are distributed in the medium, and they therefore provide only approximate measures of tissue elasticity. Artifacts in strain images, as shown in Fig. 2, occur due to stress decay and stress concentrations at internal tissue boundaries. Better images can be obtained by reconstructing tissue elasticity within the framework of the inverse problem. However, inverting measured responses such as the strain distribution proves to be difficult due to instability problems, and it requires a complete understanding of the equivalent forward problem using a finite element (FE) simulation technique. The simulation procedure for effective modelling of the elastographic problem, as shown in Fig. 3b), contains three composite, interactive parts: (1) a mechanical tool giving the simulated axial/lateral displacement and stress at each of its nodes after the application of the external compression on the specimen; (2) an acoustical tool simulating the acoustic RF signals and images inside the phantom before and after compression; (3) an image formation technique giving the estimates of the displacement field, strain distribution and elasticity image. The numerical simulations were performed using a standard finite element program. To simulate the mechanical behavior of the tissue-mimicking phantoms, a two-dimensional mesh consisting of 2236 eight-node, isoparametric, arbitrary quadrilateral elements, designed for incompressible or nearly incompressible plane strain applications, was used.
Fig. 4 a) Sketch of the finite element discretisation with the applied boundary conditions. Results of the finite element analyses for Einc/Emat = 5/1: (b) displacement in y-direction and (c) strain eps_yy.
In order to reproduce the experimental setup, the boundary conditions shown in Fig. 4a) were applied to the model. All nodes on the upper and lower boundary can move freely in x-direction, whereas the movement in y-direction is prescribed. In all FE analyses, the Young's modulus of the inclusion was greater than that of the surrounding material. The Poisson's ratio of both parts was set to 0.495, defining nearly incompressible, tissue-like materials [7].
Fig. 5 Experimentally obtained results: (a) displacement distribution, (b) strain distribution and (c) calculated logarithmic relative Young's modulus (ln(Einc/Emat)) using the measured strain image.
The experimentally obtained images of the displacement and strain distributions for the same applied boundary conditions, using an inhomogeneous gelatine phantom and the reconstruction method, are shown in Fig. 5. In order to calculate the distribution of the elastic modulus from the displacement field, we use a so-called direct method proposed by Sumi, Suzuki & Nakayama [8]. In this method, the equilibrium conditions are treated as linear partial differential equations for the shear modulus, using the measured strains and their derivatives as known parameters. This approach reduces the inverse problem of elastography to a direct problem and leads to an approximation of the elastic or shear modulus of the inclusion relative to the surrounding material.
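The relative-modulus idea can be illustrated with a deliberately simplified 1-D version of the inversion. It assumes a uniform axial stress, which, as noted above, does not hold in general (the full direct method solves partial differential equations for the shear modulus); the numbers below are illustrative:

```python
import numpy as np

def log_relative_modulus(strain, ref_strain):
    """Log relative Young's modulus from an axial strain profile.

    1-D simplification: if the axial stress is (approximately) uniform,
    equilibrium gives E(x) * eps(x) = const, so E(x)/E_ref = eps_ref/eps(x).
    """
    strain = np.asarray(strain, dtype=float)
    return np.log(ref_strain / strain)

# Softer background strained 1 %, a five-times-stiffer inclusion strained 0.2 %.
strain_line = np.array([0.010, 0.010, 0.002, 0.002, 0.010])
ln_e = log_relative_modulus(strain_line, ref_strain=0.010)
# Inside the inclusion ln(Einc/Emat) = ln(5) ~ 1.61; zero in the background.
```

This is exactly the kind of contrast plotted in Fig. 5c, up to the artifacts caused by the non-uniform stress field that the simplification ignores.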
4. Medical Results of Elastography
During the ongoing clinical study on prostate tumor diagnosis at the Urology Hospital of the Ruhr-University Bochum (Marien-Hospital Herne), more than 216 patients have undergone clinical examinations. It has been shown that our system for real-time ultrasound elastography is able to detect prostate carcinoma with a high grade of accuracy, reaching a sensitivity of 76% and a specificity of 84%, compared with only 34% using ultrasound B-mode images alone. Fig. 6 shows an example, where prostate slices with histological diagnosis
following radical prostatectomies act as reference. Cancerous areas have been stained and marked on the prostate slices.
Fig. 6 In vivo results of a human prostate: (a) Histology: tumors have been stained; malignant and benign tissue areas have been marked by a pathologist. The tumor is in the lower left side. (b) B-mode image: the tumor is barely visible. (c) Strain image: the tumor is clearly visible as a dark area on the left side. (In cooperation with the Urology Hospital, Ruhr University Bochum, Marien-Hospital Herne)
5. Electrorheological Fluids as a Basis for the Tactile Display
If the consistency of an inhomogeneous object is simulated on a haptic display, kinesthetic as well as tactile aspects are involved. This complex haptic phenomenon is relevant in medicine, where a physician makes a diagnosis by palpating an organ or tissue of a patient. For simulating the stiffness distribution of an object, an array of tactile elements is being fabricated. In order to achieve a high spatial resolution of a few millimeters, a miniaturized design for each single element has to be achieved. To limit the physical size of the actuator array, the tactile elements use an electrorheological fluid (ERF), whose viscosity can be controlled by the application of an electric field. A single cell (Fig. 7a) consists of a pair of parallel electrodes. The space between these electrodes is filled with ERF. A piston can be moved parallel to the electrodes within the ERF. The force needed to move the piston is largely determined by the viscosity of the ERF.
Fig. 7 a) Cross section of a single cell with a lamella piston between the electrodes. The movement of the piston is perpendicular to the drawing plane. A single cell covers an area of 2 x 8 mm2. Each cell consists of the opposing sides of two symmetrical polymer modules and a metallic piston of 0.15 mm width. b) An array of 16 cells with two metal pistons inserted into the cells. The housing containing the single elements is made from polymethylacrylate. A rubber sealing prevents the ERF from leaking out.
For larger arrays a large number of single elements has to be fabricated. The technology of injection moulding offers a cost-effective method of producing any needed number of modules from different thermoplastic materials. These can then be stacked into arrays of the required dimensions.
Fig. 8 Force measurements on a haptic actuator with a piston head of about 1 cm2 area, using the most convenient ER fluid (piston displacement range 0-30 mm).
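The behavior measured in Fig. 8, a resistance force rising roughly linearly with piston displacement and growing with the applied field, can be captured by a toy model. The gain value and the linear field dependence below are illustrative assumptions, not fitted data (real ER fluids respond nonlinearly to the field):

```python
def erf_piston_force(displacement_mm, field_kv_mm, k=0.167):
    """Illustrative resistance-force model for one ERF actuator cell.

    Assumes the force grows linearly with piston displacement (the active
    electrode area grows as the piston descends) and, as a simplification,
    linearly with the field strength. The gain k is a made-up constant
    chosen so that 30 mm displacement at 2 kV/mm gives roughly the 10 N
    reported for the larger 1 cm^2 piston head.
    """
    return k * field_kv_mm * displacement_mm

f = erf_piston_force(30.0, 2.0)   # roughly 10 N at full stroke and 2 kV/mm
```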
Various modifications of the piston have been investigated, with the best results for a metallic piston [9]. For force measurements on different ERF cells a stepper-motor-driven apparatus has been built. The resistance force increases linearly with the downward displacement of the piston, which is due to the enlargement of the active area of the electrodes. The force depends on the applied electric field strength and on the properties of the ERF. For the most convenient ER fluid, as shown in Fig. 8, a force of nearly 10 N could be achieved at a field strength of 2 kV/mm on a larger actuator with a piston head of about 1 cm2 area. The control unit of the actuators, with voltages of up to 2 kV and a very low current flow, required new switching concepts based on a special light-sensitive semiconductor material (gallium arsenide, GaAs) [10]. The GaAs elements are controlled from a digital output computer interface via an identically sized special LED array. The GaAs array can provide optically isolated high-voltage outputs and control the electric field in the ERF.

6. Conclusion
The first integrated haptic sensor actuator system based on ultrasound real-time elastography and electrorheological fluids is being developed and tested, with interesting applications of virtual reality in medicine. The system has the potential to improve cancer detection and allows a more reliable diagnosis. We use the new elasticity images to reconstruct virtual objects on the haptic actuator screen with improving resolution. The haptic display can allow users to palpate the patient's prostate or breast while imaging and while making a biopsy.
Acknowledgement
This work is funded by the German Federal Ministry of Education and Research (BMBF) and the German Aerospace Center (DLR), grant 01IRA14B. Financial support is gratefully acknowledged. The authors thank the Urology Hospital of the Ruhr-University Bochum, Marien-Hospital Herne (Prof. Dr. Th. Senge) for their cooperation.

References

[1] Khaled W., Ermert H., Bruhns O., Reichling S., Bose H., Baumann M., Monkman G. J., Egersdorfer S., Meier A., Klein D., Freimuth H.: A haptic sensor actuator system based on ultrasound elastography and electrorheological fluids for virtual reality applications in medicine. Studies in Health Technology and Informatics, MMVR 11, Vol. 94 (2003), pp. 144-150.
[2] Khaled W., Bruhns O., Reichling S., Boese H., Baumann M., Monkman G. J., Egersdoerfer S., Meier A., Klein D., Freimuth H., Ermert H.: A haptic sensor-actuator-system. Proceedings of the 17th International Congress and Exhibition, Computer Assisted Radiology and Surgery (CARS 2003), London, UK, June 2003, pp. 1354ff.
[3] Khaled W., Bruhns O., Reichling S., Boese H., Baumann M., Monkman G. J., Egersdoerfer S., Meier A., Klein D., Freimuth H., Ermert H.: A haptic system for virtual reality applications based on ultrasound elastography. 27th International Acoustical Imaging Symposium, Saarbruecken, Germany, March 2003, in press.
[4] Khaled W., Bruhns O., Reichling S., Boese H., Baumann M., Monkman G. J., Egersdoerfer S., Meier A., Klein D., Freimuth H., Ermert H.: A new haptic sensor actuator system for virtual reality applications in medicine. Medical Image Computing and Computer Assisted Intervention (MICCAI 2003), Toronto, Canada, in press.
[5] Monkman G. J., Bose H., Ermert H., Khaled W., Klein D., Freimuth H., Baumann M., Egersdorfer S., Bruhns O. T., Meier A., Raja K.: Smart fluid based haptic system for telemedicine. 7th International Conf. on the Medical Aspects of Telemedicine, Regensburg, Germany (2002), pp. 59-70.
[6] Pesavento A., Lorenz A., Ermert H.: System for real-time elastography. Electronics Letters, Vol. 35, No. 11 (1999), pp. 941-942.
[7] Rychagov M. N., Khaled W., Reichling S., Bruhns O., Ermert H.: Numerical modeling and experimental investigation of a biomedical elastographic problem using a plane strain state model. Fortschritte der Akustik, DAGA 2003, Proceedings, Aachen, Germany, March 2003, pp. 586-589.
[8] Sumi C., Suzuki A., Nakayama K.: Estimation of shear modulus distribution in soft tissue from strain distribution. IEEE Transactions on Biomedical Engineering, Vol. 42 (1995), pp. 193-202.
[9] Bose H., Monkman G. J., Freimuth H., Klein D., Ermert H., Baumann M., Egersdorfer S., Khaled W., Bruhns O. T.: ER fluid based haptic system for virtual reality. 8th International Conf. on New Actuators, Bremen (2002), pp. 351-354.
[10] Monkman G. J., Meier A., Egersdorfer S.: Opto-isolated high voltage control array. UK/European patent pending (filed March 2003).
IN VIVO STUDY OF FORCES DURING NEEDLE INSERTIONS
B. MAURIN*, L. BARBE*, B. BAYLE*, P. ZANNE*, J. GANGLOFF*, M. DE MATHELIN*, A. GANGI+, L. SOLER++ AND A. FORGIONE++
* LSIIT, UMR CNRS-ULP 7005, bd. S. Brant, 67400 Illkirch, France
+ Département de Radiologie B, Hôpital Civil, 67091 Strasbourg, France
++ IRCAD, Hôpital Civil, 67100 Strasbourg, France
e-mail: [email protected]

Percutaneous procedures are among the developing minimally invasive techniques to treat cancerous diseases of the digestive system. They require a very accurate targeting of the organs, achieved by the combination of tactile sensing and medical imaging. In this paper, we study the forces involved during in vivo percutaneous procedures for the development of a force-feedback needle insertion robotic system, as well as of a realistic simulation device. The paper presents different conditions (manual and robotic insertions) and different organs (liver and kidney). Finally, we review some biomechanical models from the literature in the light of our measurements.
1. Introduction
1.1. Needle insertion procedures
Needle insertions are necessary in a wide range of traditional medical treatments like injections or punctures. They are also among the less invasive procedures in developing radiological and surgical techniques such as, e.g., the treatment of cancer by radio-frequencies. An increasing number of percutaneous treatments can be expected in the coming years. During these operations, the feedback to the practitioner is both visual and haptic. The visual feedback is given by the position of the insertion point, the needle penetration and usually by intra-operative medical imaging (CT-scan, fluoroscopy, ultrasound). Force feedback is also of central importance. Indeed, this feedback is a great source of information for the practitioner: it may allow him to detect transitions between organs and cavities and even to identify the nature of the organs and the properties of the tissues. To allow for a precise positioning of the needle in the targeted anatomical structures, these treatments presently require a lengthy procedure with successive image taking and needle insertion steps. As the effectiveness of
the treatment is strongly dependent on the needle tip position, the practitioner's skill and experience, as well as the number of per-operative image taking steps, are critical for the outcome of the percutaneous procedure. For all these reasons, a recent interest in robotic devices for needle insertions has arisen [5, 9]. Nevertheless, very few projects have yet taken into account the problem of force feedback. This is actually one of the main limitations of pioneering robotic systems. As we believe that the radiologist should remain the principal actor of the operation, such a robotic system should necessarily be tele-operated. Hence real-time force feedback is necessary. We are presently involved in the design of a force-feedback tele-operated robotic system dedicated to CT-guided needle insertions [7]. It will allow, at the same time, a good protection of the practitioner with respect to X-rays, together with visual and haptic feedback. The haptic interface will also be used for simulation and training.

1.2. Why is it necessary to study forces?

The nature of the interaction between the needle and the tissues makes the understanding of deformation, cutting and friction mechanisms quite difficult, and it deserves further detailed examination. Different studies have dealt with the measurement and the modeling of forces involved in percutaneous needle insertions [3, 8]. They can be classified according to the nature of the bodies in which the needle is inserted: artificial phantoms [3] or animal tissues [8]. In the first case the authors are especially interested in the correlation between forces and deformations. Such measurements are mainly of interest to build simulators. In the second case, measurements are more interesting from a clinical point of view, as they allow to study the behavior of real tissues. Unfortunately, they deal with dead tissues, and in spite of good conditions of preservation [8], their bio-mechanical properties might be altered. This is especially the case with organs well irrigated by blood, like the liver or the kidney. To our knowledge, very few studies give in vivo measurements of forces in surgery [1, 2], and none of them specifically deals with percutaneous procedures. The main originality of our paper is to characterize the efforts involved in in vivo percutaneous procedures. The experiments detailed in section 2 allow: (i) to establish the ranges of forces involved in percutaneous procedures; (ii) to analyze and model the temporal evolution of the forces during the insertions. These results are interesting for many reasons. First, they allow to define the design constraints for a robotic system. Then, they help to understand the sensitivity of force perception, in order to design a haptic interface or a realistic simulator.
2. Experimental Setup
We used a Nano-17 6-DOF force sensor from ATI Automation Inc., together with a special handling tool we made to rigidly hold the needle. This device can either be mounted as a robotic arm end-effector or held by hand. Typical resolutions of the sensor are 0.0125 N along its axis and 0.0625 mN.m in torque. Dedicated software running on a real-time Linux OS has been developed, allowing synchronized acquisition of measurements and robot control.

2.1. Methods
All experiments were performed with the same 18 gauge (1.27 mm diameter), 15.24 cm biopsy needle on anesthetized pigs. Since the pig is alive, breathing was artificially stopped for the period of insertion, so that this repetitive perturbation had no impact on the data. Two different methods were used for acquiring in vivo data: in the first case, a proficient radiologist inserts the needle by manually holding the force device attached to it. The task executed by the radiologist was to plunge the needle 4 times into the targeted organs, at constant velocity, constant depth and with as little shaking as possible. The depth was hard to estimate, approximately from 30 to 50 mm. In the second case, the needle insertion is done by a robot, with the assistance of a surgeon. The insertion consists of a descent of 20 mm into the tissue at a constant velocity of 15 mm/s, then an 8 s pause, and finally the extraction of the needle back to its initial position at 15 mm/s.
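The robotic insertion protocol just described (descend 20 mm at 15 mm/s, pause 8 s, retract at the same speed) can be sketched as a reference trajectory generator; the function and parameter names below are ours, not the authors' controller code:

```python
import numpy as np

def insertion_profile(t, depth=20.0, speed=15.0, pause=8.0):
    """Reference needle depth (mm) at time t (s) for the robotic protocol:
    descend `depth` mm at `speed` mm/s, hold for `pause` s, then retract
    at the same speed back to the initial position."""
    t_down = depth / speed          # end of the descent phase
    t_up = t_down + pause           # start of the extraction phase
    t_end = t_up + depth / speed    # back at the initial position
    if t < t_down:
        return speed * t
    if t < t_up:
        return depth
    if t < t_end:
        return depth - speed * (t - t_up)
    return 0.0

# Sampled trajectory over the whole cycle (about 10.7 s in total).
times = np.arange(0.0, 11.0, 0.5)
profile = [insertion_profile(t) for t in times]
```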
As we previously underlined, both these experiments serve as an original database, since very few such data exist. Manual measurements allow to characterize the forces that a radiologist is used to feeling (see Fig. 1). This is of great interest for engineers, as practitioners have difficulties characterizing the forces and torques they apply naturally during operations.

Figure 1. Manual setup

Robot insertions also have several benefits (see Fig. 2). In particular, no human intervention is needed except control; hence the same experiment may be repeated several times. The insertion parameters (depth, velocity) can be easily modified, which can help to model the tissues. The main inconvenience in both types of experiment is that the insertions were carried out "blindly", as we had no imaging device at our disposal. Hence no control of the localization and the depth was possible. From the literature, we know that the depth and speed of insertion are of importance. The lack of precise knowledge of the depth of insertion in the organs made it difficult to respect this constraint, and we based our insertions on an a priori constant depth. We distinguish two different categories of access to the organs: in the first case the skin is cut superficially (as is usual practice during percutaneous interventions) and we perforate fasciae, or connective tissues, and some muscles. These experiments are referred to as "with skin" in the following. In the second case, all the anatomic layers are cut, thus allowing direct access to the organ. These experiments are referred to as "direct access" in the following. Most experiments dealt with the liver, but some measurements on kidneys and pancreas were also made, since these organs are also prone to percutaneous procedures.

Figure 2. Robotic setup

2.2. Results
2.2.1. Ranges of forces

We first focused on the maximum forces exerted during insertions. As expected, it appears from the statistics that the most significant results are given by the force along the insertion axis (in the other directions, common values are 2 x 10^-3 N for forces and 6 x 10^-3 N.m for torques); as a result, these data are no longer considered in the following analysis. The statistics obtained for the radiologist's manual insertions are presented in Table 1; the figures for the robotic insertions are presented in Table 2.

Table 1. Manual insertions

Organs and method (# trials)          Maximum force (N)   Std. deviation (N)
Liver, with skin (10)                 3.73                0.59
Liver removal, with skin (10)         2.33                0.32
Liver, direct access (6)              0.70                0.29
Liver capsule, direct access (6)      0.23                0.04
Liver removal, direct access (6)      0.30                0.28
Kidney, direct access (5)             0.74                0.54
Pancreas, direct access (5)           0.83                0.28

Table 2. Robotic insertions

Organs and method (# trials)          Maximum force (N)   Std. deviation (N)
Liver, with skin (6)                  1.89                0.36
Liver removal, with skin (6)          0.69                0.28
Liver, direct access (6)              0.59                0.17
Liver capsule, direct access (6)      0.35                0.12
Liver removal, direct access (6)      0.17                0.06
Kidney, direct access (4)             1.22                0.34

Comparing insertions with and without direct access shows that the skin and the muscles accentuate the forces by adding contact forces and elasticity to the global resultant. The differences between manual and robotic insertions are mainly due to the uncontrolled speed and depth of the radiologist.
2.2.2. Evolution of longitudinal force

Figure 3 provides a typical plot of an insertion into the liver by direct access. The plot can be split into 3 phases.

Figure 3. Needle insertion with direct access to the liver

The descent phase is the most interesting for force modeling: it corresponds to the succession of an exponential-like rise, a sharp rupture, and finally again a rising slope. It matches the phases felt by the radiologist while piercing the capsule of the organ. The maximum force value is finally reached when the movement ceases. As a consequence of organ elasticity, this peak is roughly proportional to the depth of penetration. The second phase corresponds to the relaxation of the tissues. Not surprisingly, even if the needle is motionless, a slowly decreasing force remains along the axis of the needle. This is known as a repulsion force applied on the tip. It depends mainly on the kind of bevel of the needle. For our experiment a regular bevel was used, and the liver relaxation converges to a constant repulsion force of 0.3 N on average. Finally, the last phase corresponds to the removal of the needle, during which only friction forces are applied on the surface of the needle: after a fall to 0 N, we obtain an exponential-like plot as in the insertion phase. Figure 4 allows the comparison between the manual and the robotic insertions in the case of the liver "with skin".
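The sharp rupture that separates the exponential-like rise from the second rising slope can be located automatically, for instance when segmenting recorded traces by phase. The heuristic and the synthetic trace below are illustrative, not the authors' processing:

```python
import numpy as np

def find_puncture(force, fs=100.0):
    """Locate the capsule puncture in a descent-phase force trace.

    Simple heuristic: the puncture appears as the sharpest drop in the
    axial force, so return the time and force value just before the most
    negative first difference. `fs` is the sampling rate in Hz.
    """
    force = np.asarray(force, dtype=float)
    i = int(np.argmin(np.diff(force)))
    return i / fs, force[i]

# Synthetic descent trace: exponential-like rise, rupture at t = 1 s,
# then a new, slower rising slope.
t = np.arange(0, 2, 0.01)
f = np.where(t < 1.0, 0.5 * (np.exp(1.2 * t) - 1), 0.2 + 0.4 * (t - 1.0))
t_punct, f_peak = find_puncture(f)
```

On real, noisy data one would low-pass filter the trace first and require a minimum drop amplitude before declaring a puncture.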
Figure 4. Radiologist and robotic insertions with skin access to the liver (insertion, relaxation and extraction phases are marked).
2.3. Modeling of the Liver
To model the insertion into the liver, we compared two models fitted to our measurements. Simone and Okamura [8] model the forces as the sum of 3 components: stiffness, friction and cutting. We will only stress the stiffness force and the friction force, as our test procedure was not the same as that of the authors. The stiffness force is exemplified during the insertion phase, before the puncture of the capsule. A second-order polynomial f(d) = a0 + a1*d + a2*d^2, relating depth to force, is well suited to this curve. After data fitting, the following coefficients were obtained: a0 = 0 N, a1 = 0.002 N/mm and a2 = 0.0023 N/mm^2. These values are very close to those reported in [8]. The friction force is given by a Karnopp model, and we found a positive static friction of 0.121 mN/mm and a negative one of -0.117 mN/mm. The dynamic friction coefficients were considered null (force proportional to distance). The positive damping coefficient was 3.2 mN.s/mm^2, whereas the negative one was 1.6 mN.s/mm^2. The second model we tested is taken from Maurel [6] and based on the work of Fung [4]. The insertion has two phases: before the capsule puncture and after. In each phase, the force is modeled as an exponential function of the depth: f(d) = (F0 + b) e^{a(d - d0)} - b. With a Newton-Raphson optimization we found the parameter vector theta = [a, b, d0, F0] before the puncture: theta = [0.121, -0.098, 11.45, 0.2]; and after the puncture: theta = [-0.031, 1.7, 19.61, -3.39]. In Fig. 5, the two models are drawn for comparison. For the first model we obtain a mean error of 7.4 mN and a standard deviation of 0.69 mN; for the second, 0.79 mN and 0.31 mN. As expected, the errors are low, and both models are of interest.
Figure 5. Two models fitted to our data (left: Simone, right: Maurel); abscissa: depth d in 10^-3 m.
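Using the coefficients reported above, the two fitted stiffness models can be evaluated directly. The exact exponential form is our reading of the printed equation (note that it satisfies f(d0) = F0 by construction):

```python
import numpy as np

# Reported coefficients for the liver insertion phase (before puncture).
A0, A1, A2 = 0.0, 0.002, 0.0023            # N, N/mm, N/mm^2 (polynomial model)
A, B, D0, F0 = 0.121, -0.098, 11.45, 0.2   # exponential (Fung/Maurel-type) model

def f_poly(d_mm):
    """Simone/Okamura-style stiffness force: second-order polynomial in depth."""
    return A0 + A1 * d_mm + A2 * d_mm ** 2

def f_exp(d_mm):
    """Exponential model f(d) = (F0 + b) * exp(a (d - d0)) - b."""
    return (F0 + B) * np.exp(A * (d_mm - D0)) - B

print(f_exp(D0))    # ~0.2 N at the reference depth, as expected
print(f_poly(10.0)) # 0.002*10 + 0.0023*100 = 0.25 N at 10 mm
```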
3. Conclusion

This paper presents a data analysis to characterize the values of forces during in vivo needle insertions. It underlines both the sensitivity necessary to render the organ transitions and the significant forces required to pierce the tissues. As an application, models are derived from these data. Future work will include the measurement of the insertion depth and eventually of the organ deformations. Another point to develop would be to obtain data during a real human intervention. This was prepared for in our experimental setup, as the mechanical parts of the sensor can be protected by a sterilized bag.

Acknowledgments

The authors wish to thank the Region Alsace Council and the CNRS ROBEA program for their financial contribution to this project.
References

1. I. Brouwer, J. Ustin, L. Bentley, A. Sherman, N. Dhruv, and F. Tendick. Measuring in vivo animal soft tissue properties for haptic modeling in surgical simulation. In Medicine Meets Virtual Reality 2001, pages 69-74, 2001.
2. J. D. Brown, J. Rosen, Y. S. Kim, L. Chang, M. N. Sinanan, and B. Hannaford. In-vivo and in-situ compressive properties of porcine abdominal soft tissues. In Medicine Meets Virtual Reality, pages 26-32, January 2003.
3. S. P. DiMaio and S. E. Salcudean. Needle insertion modelling and simulation. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation (ICRA'02), pages 2098-2105, Washington, DC, USA, May 2002.
4. Y. C. Fung. Biomechanics: mechanical properties of living tissues. Springer-Verlag, second edition, 1993.
5. J. Hong, T. Dohi, M. Hashizume, K. Konishi, and N. Hata. An ultrasound-driven needle-insertion robot for percutaneous cholecystostomy. Physics in Medicine and Biology, pages 441-455, 2004. IOP Publishing Ltd.
6. W. Maurel. 3D Modeling of the Human Upper Limb Including the Biomechanics of Joints, Muscles and Soft Tissues. PhD thesis, Laboratoire d'Infographie, Ecole Polytechnique Federale de Lausanne, 1999.
7. B. Maurin, O. Piccin, B. Bayle, J. Gangloff, M. de Mathelin, P. Zanne, C. Doignon, A. Gangi, and L. Soler. A new robotic system for CT-guided percutaneous procedures with haptic feedback. In Proceedings of the 2004 Computer Assisted Radiology and Surgery Congress (CARS'04, to appear), Chicago, USA, June 2004.
8. C. Simone and A. M. Okamura. Modeling of needle insertion forces for robot-assisted percutaneous therapy. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation (ICRA'02), pages 2085-2091, Washington, DC, USA, May 2002.
9. D. Stoianovici, L. L. Whitcomb, J. H. Anderson, R. H. Taylor, and L. R. Kavoussi. A modular surgical robotic system for image guided percutaneous procedures. In Proceedings of the 1998 International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI'98), Cambridge, MA, October 1998.
TEACHING BONE DRILLING: 3D GRAPHICAL AND HAPTIC SIMULATION OF A BONE DRILLING OPERATION
H. ESEN AND M. BUSS
Technische Universität München, Institute of Automatic Control Engineering, D-80290 Munich, Germany
Email: hasan.[email protected]; martin.[email protected]

K. YANO
Toyohashi University of Technology, Dept. of Production Systems Engineering, Hibarigaoka 1-1, Tempaku, 441-8580 Toyohashi, Japan
Email: [email protected]

In this work, a prototype medical training system (MTS) for teaching the bone drilling skill is proposed. The MTS contains a 3D graphical and a 3D haptic display. For teaching purposes, a control algorithm is developed and presented in this paper. Preliminary user test results suggest that the proposed MTS is a powerful tool for training the core skills of a bone drilling procedure.
1. Introduction
Haptic devices have found a place in many different application areas, particularly in the medical sciences. In comparison with the research on soft tissue interaction and on developing simulators for minimally invasive surgery (MIS), there are rather few works concerning surgery for cutting, burring and drilling of bones [1-6]. In this work, a skill training system for bone drilling in a 3D virtual environment (VE) with haptic feedback is proposed. Unlike previous works in the "bone drilling simulation" field, our work concentrates on teaching the core skills of bone drilling processes to novice surgeons. For this purpose, the control algorithm developed by Esen et al. is improved for a 3 degree-of-freedom (DOF) haptic device and presented in this paper. The algorithm enables medical students and novice surgeons to train their surgical skill for drilling into bone in a 3D VE. Bone drilling is needed prior to many orthopedic operations, such as pin or screw insertion into the bone, and it requires a high level of surgical skill. The core skills for bone drilling procedures can be stated as: recognizing the drilling end-point; applying a constant, sufficient but non-excessive
thrust force and feeding velocity. Using the developed control algorithm, these skills can be trained simultaneously. The concept of the training system is shown in Figure 1. The system has three main elements: a 3 DOF haptic display, a 3D visual display and a controller working behind the displays. The haptic device is introduced in section 2, whereas the visual display and the controller are discussed in sections 3 and 4, respectively.
Figure 1. Concept of the bone drilling medical training system
2. Haptic Device: VISHARD3
The haptic device VISHARD3 (see Figure 1) was developed at the Technical University of Berlin, while all authors were in Berlin. Normally it has 3 active DOF, but there is the possibility to add 3 more passive DOF through an end-effector design. Preliminary user tests were performed using only the active 3 DOF of the haptic device. The first two links of VISHARD3 are designed in a SCARA configuration. While these two links provide the movement in the x-y directions, the third, rotational link changes the end-effector position in the z direction. Each joint has a DC electric motor (MAXON RE40) together with an encoder and a harmonic drive (HFUC-2UH-17). The contact force between the finger of the operator and the manipulator is measured by a force-moment sensor (JR3), which is placed between the last link and the end-effector. For measurement data acquisition and to control the PWM amplifier, an I/O card developed in our laboratory is used. Some important technical data of VISHARD3 are given in Table 1. A detailed dynamical analysis of VISHARD3 can be found elsewhere.
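The SCARA arrangement described above (two revolute links for x-y, a third rotational joint for z) can be illustrated with a forward-kinematics sketch. The link lengths and the z-axis mapping below are illustrative assumptions, not VISHARD3's actual dimensions:

```python
import math

def scara_forward(theta1, theta2, theta3, l1=0.3, l2=0.3, r=0.05):
    """Forward kinematics of a SCARA-like device: two revolute joints
    position the end-effector in the x-y plane, while a third revolute
    joint is mapped to z (here via an assumed effective lever radius r).
    Link lengths l1, l2 (m) and radius r are illustrative values.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    z = r * theta3
    return x, y, z

# First link along x, second link bent 90 degrees toward y:
x, y, z = scara_forward(0.0, math.pi / 2, 0.0)
# (x, y, z) is approximately (0.3, 0.3, 0.0)
```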
Table 1. Technical data of VISHARD3.

Active DOF                            3
Passive DOF                           3
Workspace (Width x Depth x Height)    60 x 40 x 35 cm
Position resolution                   0.034 mm
Max. force                            87 N
Max. velocity                         0.68 m/s
Max. acceleration                     10.07 m/s^2
3. Visual Displays

Two different graphical user interfaces (GUIs) were developed as visual displays. The first one represents the "Simulator Mode" (S-Mode), whereas the second one is used in the "Teaching Mode" (T-Mode). Basically, the T-Mode GUI is an improved version of the S-Mode GUI. The S-Mode contains a 3D box model and a 3D drill model. The box model acts as a virtual bone which is drilled by the 3D virtual drill model. When the trainee moves the end-effector of the haptic display, the medical drill model in the GUI moves as well. If there is a collision between the drill model and the box, the trainee feels it through the haptic display. The T-Mode additionally comprises several indicators which inform the trainee about his/her trial on-line. Such indicators are not available to surgeons during a real bone drilling operation; therefore the S-Mode simulates a real bone drilling surgery, whereas the T-Mode primarily aims to shorten the learning time of a novice surgeon for drilling into a bone. The available indicators in the T-Mode are: a force indicator, a drilling velocity bar graph, a drilling acceleration bar graph and a drilling end-position warning signal. Only the latter is also in use in the S-Mode, because similar navigation systems that warn the surgeon about the drilling position exist in real bone drilling operations. The GUIs are programmed in C++ with OpenGL in a Linux environment.
4. Controller Algorithm

The control algorithm explained by Esen et al.7 is improved for applications in 3D space. This novel control algorithm enables the user to feel a constant drilling velocity when a correct drilling force is applied. Together
Figure 2. Simplified controller block diagram.
with the GUIs, the controller also provides end-position feedback. If the user exceeds the end-position, he/she feels that the bone (the virtual box model) is drilled through. The simplified block diagram of the controller is given in figure 2. From VISHARD3, the position qm of the end-effector and the force Fm applied to the end-effector are measured. In the VE, depending on Fm, qm and the predefined minimum fzmin and maximum fzmax applicable thrust forces, the collision is checked and the reference position vector qref is defined. Then a position controller is implemented using real-time Matlab and Simulink models. In our work only drilling in the z direction is possible. Therefore checking the collision is quite simple, i.e. the z coordinate of the upper surface of the box has to be compared with the z coordinate of the drill tip. If a collision occurs, drilling into the box model with a constant velocity is possible only if the applied force is correct. The reference velocity signal in the z direction, vz,ref, is:

    vz,ref = vc   if fzmin <= Fz <= fzmax,
    vz,ref = 0    otherwise.                 (1)
Integrating equation 1, the reference position signal for the z direction, pz,ref, is obtained as pz,ref = ∫ vz,ref dt. The reference signals in the x and y directions are defined directly as positions. If the collision occurs at time t1, the reference signals are px,ref = x(t1) and py,ref = y(t1), where x(t) and y(t) are the measured drill tip positions. Note that when there is no collision, the controllers are switched off, i.e., zero volts are sent to the motors of the robot. For this phase of the project, we do not apply any free-space controller. In figure 3, the created reference signal positions versus time are shown. It is assumed that the collision occurs at time tc, that the user starts drilling at that time, always applies the correct force and finishes drilling at time tf.

Figure 3. Created reference signals.

In the z direction a PD position controller is applied, whereas for the x and y directions just a P-controller is used. The reason lies in how the reference signals are created: in the x and y directions the change in the error is like a step function, which causes an impulse force at the step time; the step time is the collision time tc.

5. Controller Performance
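The force-gated reference generation and the PD/P position control of section 4 can be sketched as follows. This is a simplified reconstruction, not the authors' Matlab/Simulink implementation; the gains kp and kd are invented placeholders, while the force window and reference velocity follow the experimental values given below (40 N, 45 N, 0.1 cm/s):

```python
def z_reference_velocity(f_z, f_min=40.0, f_max=45.0, v_c=0.001):
    """Reference drilling velocity in z (m/s): constant v_c only while
    the applied thrust force lies inside the correct window, zero
    otherwise (cf. equation 1)."""
    return v_c if f_min <= f_z <= f_max else 0.0

class DrillController:
    """PD position control in z, plain P control in x and y; all
    outputs are zero while no collision is detected (free space).
    Gains kp, kd are invented placeholders."""
    def __init__(self, kp=100.0, kd=1.0):
        self.kp, self.kd = kp, kd
        self.prev_ez = 0.0

    def update(self, collided, q_ref, q_meas, dt):
        if not collided:              # free space: zero volts to the motors
            return (0.0, 0.0, 0.0)
        ex, ey = q_ref[0] - q_meas[0], q_ref[1] - q_meas[1]
        ez = q_ref[2] - q_meas[2]
        uz = self.kp * ez + self.kd * (ez - self.prev_ez) / dt
        self.prev_ez = ez
        return (self.kp * ex, self.kp * ey, uz)

ctrl = DrillController()
# Before collision the controllers stay switched off.
u_free = ctrl.update(False, (0.0, 0.0, 0.4), (0.0, 0.0, 0.4), 0.001)
```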
For the experiments the following numerical values apply: fzmin = 40 N, fzmax = 45 N, vc = 0.1 cm/s, drilling start point = 0.4 m, drilling finishing zone = 0.255 m - 0.245 m. It is a known fact that, depending on the bone to be drilled and the drill tip in use, the necessary thrust force for drilling into bone varies between 5 N and 200 N. Inexperienced doctors especially need to train to apply large forces properly. Because of the limited force capacity of VISHARD3, the maximum applicable force is defined as 45 N, leaving a margin of 42 N (between 45 N and 87 N) in case excessive forces are applied. In figure 4(a), the drilling position versus time is depicted, in comparison with the position error ez in the z direction. The finishing zone of the drilling is also shown by dotted lines. Figure 4(b) shows the applied thrust force versus time. The minimum and maximum applicable forces are indicated by dotted lines. In figures 4(c,d) and 4(e,f), the results of the position control in the x and y directions are shown, respectively. The figures indicate that after the collision is detected and the drilling process has started, i.e. the drill bit is in the virtual box model, it is not possible to move the drill along either the x or the y direction, although the trainee applies force in those directions.

6. User Experiments
After integrating the main components, namely the haptic device, the GUIs and the controllers, a prototype for a bone drilling MTS in a 3 DOF environment is
Figure 4. Controller performance in the x and y directions. Panels: (b) z-force [N]; (c) x-position [m]; (d) x-force [N]; (e) y-position [m]; (f) y-force [N]; time axes in [s].
obtained. In this section it is investigated how well the developed MTS improves the skill of trainees. For the experiments, the applicable force limits and the reference velocity are set to the same values as given in section 5. In each experiment set there are two participants. The first participant is an "expert user" of the bone drilling MTS and he/she represents an experienced, highly skilled surgeon. The second one is a trainee who has not been introduced to the MTS before and has no experience in bone drilling. He/she represents an inexperienced medical student who needs to learn how to drill into a bone. The data of the applied force during a trial on the MTS is used to evaluate how successful the training is. For the bone drilling MTS it is accepted that a correct-force percentage of 90% or more represents a "skillful trainee". Another important issue for the evaluation is the drilling end-position. If a trainee misses the point at which he/she has to stop drilling, the trial is counted as a failure, i.e. he/she gets 0% success. The trainee has 30 s to finish the bone drilling task. An experiment set has three steps. Step 1: "Beginning Trials". The task is explained to the trainee and he/she is asked to make a trial. Then an example drilling procedure is performed by the expert user, allowing the trainee to observe what a correct drilling should look like. After that, the trainee is asked to make another trial. Step 2: "Training". Only in this step is the trainee allowed to use the T-Mode. Depending on the wish of the trainee, the GUI is switched between S-Mode and T-Mode. The trainee should keep on training until the skillful
Figure 5. Success throughout the trials.
trainee level (STL) is reached and he/she feels confident about reaching this level. Step 3: "Checking". The trainee is asked to make 3 trials on the MTS in S-Mode, and it is expected that he/she will reach the STL in all of these trials. The experiment is finished when the trainee completes step 3 successfully. The saved measurement data is presented to the trainee after each trial, depending on his/her wish. This is a very useful way for a trainee to understand his/her mistakes.

7. Experimental Results and Discussion

In figure 5 the success percentage of a trainee versus the number of trials is sketched. The STL is emphasized by a bold line at 90%. The improvement in the drilling skill through repeated trials is clearly visible. The success percentages of the last 3 trials are 92%, 90% and 92%. To see more clearly whether the trainee gained the feeling of applying the correct force, two bad trials are compared with each other. The first bad trial, B1, is chosen from the "first trials" period and the second one, B2, from the "last trials" period. The comparison of the applied forces in trials B1 and B2 is given in figure 5b. It is obvious that in trial B2 the trainee has a much better force feeling. The amplitude of the force oscillations around the correct force range is smaller than in trial B1. Though the timing is not good, he/she manages to get into the correct force range in B2; that was not the case for B1. The trial S1 is from the last trials and is an example of correct drilling. In figure 6, the drilled lengths during trials B1, B2 and S1 are compared. In addition to the improvement in the applied force, it can be seen in this figure that the timing of the trainee gets better. In trial S1, the trainee applies the correct force with good timing, keeps the
Figure 6. Comparison of the worst trial of the early training period and the worst trial of the late training period.
velocity constant and finishes the drilling in 16 seconds, whereas in trial B2, although the trainee finishes the drilling job, the timing is bad, namely around 30 s. In the early trainings, the trainee cannot finish the drilling job at all. The trainee whose results are discussed above managed to reach the STL in 37 trials in S-Mode. After his/her second trial, he/she used the T-Mode 21 times in sequence. That helped him/her to get a first idea about his/her applied force and velocity. He/she did his/her other trials afterwards only in S-Mode. It was observed that, because of the dominance of visual feedback, the more often a trainee uses the T-Mode, the more he/she loses concentration on the haptic feedback. To keep the concentration of the trainee high, a break is given whenever the trainee wishes. In the case of the trainee above, after the 22nd S-Mode trial a break was given until the next morning. When he/she was back in the experiments, the STL was reached in 15 trials.

8. Conclusion
In this work a medical training system (MTS) for teaching bone drilling is proposed. For this purpose a control algorithm is developed and applied. The controller performance is satisfying. Using the 3 DOF haptic device, the trainees are allowed to move a drill bit model in a VE and drill a virtual box. They are asked to keep the applied thrust force and thrust velocity constant in order to be able to drill into the virtual box model. This is the requirement for successful drilling into bone in a real medical operation. Preliminary user studies suggest that the proposed system can be a powerful tool for teaching how to drill into bone. It is also indicated that some additional visual indicators are helpful for teaching. They are good for giving a first idea about the correctness of the applied force and velocity, but after a while they cause the trainee to lose his/her concentration on the haptic feedback.
References
1. R. Schmidt. Spinal tap: An architecture for real-time vertebrae drilling simulation. Technical report, Department of Computer Science, University of Calgary, 2002.
2. Agus M., Giachetti A., Gobbetti E., Zanetti G., and Zorcolo A. Real-time haptic and visual simulation of bone dissection. Presence, 12(1):110-124, 2003.
3. Petersik A., Pflesser B., Tiede U., Höhne K. H., and Leuwer R. Realistic haptic volume interaction for petrous bone surgery simulation. In CARS Proceedings, Paris, France, 2002.
4. Galanos P. and Karalis T. Research on the mechanical impedance of human bone by a drilling test. J. Biomechanics, 15(8):561-581, 1982.
5. Wiggins K. L. and Malkin S. Drilling of bone. J. Biomechanics, 9:553-559, 1976.
6. Udiljak T., Ciglar D., and Mihoci K. Influencing parameters in bone drilling. In 9th International Scientific Conference on Production Engineering CIM, pages 133-142, Lumbarda, 2003.
7. Esen H., Yano K., and Buss M. A control algorithm and preliminary user studies for a bone drilling medical training system. In Proc. of RO-MAN, San Francisco, USA, 2003.
8. Fritschi M. Aufbau eines Trainingssimulators für die digitale rektale Untersuchung von Prostatakarzinomen [Development of a training simulator for the digital rectal examination of prostate carcinomas]. Master's thesis, Institut für Flugmechanik und Flugregelung, Universität Stuttgart, 2002.
MICROMACHINED SILICON 2-AXIS FORCE SENSOR FOR TELEOPERATED SURGERY
FREDERICK VAN MEER, DANIEL ESTEVE, ALAIN GIRAUD AND ANNE MARIE GUE
LAAS-CNRS, 7, Av. du Colonel Roche, 31077 Toulouse cedex 4, France
In this paper a new type of 2-dimensional mechanical sensor is presented, designed to measure thrust and pull constraints and for direct integration inside surgical instrument jaws. The architecture of this sensor is close to that of micro-accelerometers using capacitive detection. The first prototype was mounted in 5 mm forceps jaws with the possibility to measure macro-forces in two directions. The design of this sensor is part of a French national laparoscopic robotic project named EndoXiroB. In this paper we describe the EndoXiroB project and the interest of using microtechnologies for surgical tools, followed by the presentation of the structure and the fabrication of the force sensor. We then detail the simulation and characterisation of its mechanical aspects, together with the calibration method.
1. Introduction
Teleoperated surgery has become an important research topic within the medical field due to crucial breakthroughs. Recently, several laboratories and companies (SINTERS SA as industrial leader, LAAS/CNRS, CEA-CEREM, LIRMM, INRIA-Chir, ONERA-CERT, CHU-Toulouse, IET, SIQUALIS) have started the EndoXiroB project in partnership. The robots for laparoscopic surgery already on the market have allowed consistent progress in medical robotics. Nevertheless, these systems are very heavy, not well suited for operating rooms, expensive and without a real haptic interface. The EndoXiroB project is aimed at the development of a system for the new generation of operating rooms and at the design of smart surgical tools using the latest advances in microtechnologies. The architecture of the slave arm was designed by LAAS/CNRS and the industrial prototype was assembled by the SINTERS-SA company. The principle of the robot, patented by these two actors, consists in a double parallelogram structure that creates a fixed point for the trocar insertion [1]. The 120° articulation cone of the robot arm is generated by the movement of two plates fixed on a main motor. The robot is composed of several aluminium structures and carbon tubes which provide very good rigidity and movement accuracy (cf. Figure 1).
* The EndoXiroB project is supported by an RNTS grant; you can obtain more
information on http://www.endoxirob.com.
Figure 1: The first industrial prototype of the EndoXiroB project, designed by LAAS/CNRS and SINTERS SA.
The other aspect of the EndoXiroB project is the design of the laparoscopic instrumentation. The design of the end effectors uses the latest microtechnological advances in order to create very small and intelligent systems. Indeed, thanks to microelectronics technologies it is now possible to realize microsystems including sensors/actuators and their electronics in the volume of a 5 mm forceps. The main benefit of using microtechnologies in this project is the possibility to integrate a sensor directly inside the surgical tool to evaluate the forces applied on tissues during surgery. C. R. Wagner et al. have demonstrated improved performance and safety of surgical acts with force feedback during blunt dissection [2]. The use of force feedback allows a decrease of the force magnitude peak and average and of the number of tissue damages, without decreasing the precision of the surgical act. In our case, the difficulty was to design a force sensor that can be integrated inside the surgical tool. Current sensors used in studies on force feedback for robotic surgery are generally too big for direct measurement of the forces applied on the tissues. Because of problems arising from mechanical friction and sterility issues, it was impossible to use an industrial force sensor on the tube of the instrument. In this case, the choice was to develop a thin silicon planar sensor in order to evaluate the thrust and pull forces. Indeed, several works on tactile sensors for surgical applications [3] already exist, and it seemed interesting to develop a very small sensor integrated inside the forceps or scissor jaws to measure thrust and pull constraints.
2. New design concept for medical instrumentation
An interesting concept, patented by LAAS/CNRS and SINTERS together with the 2D force sensor, consists in the design of the MEMS† as an actor of the surgery [4]. All microsystem techniques (silicon etching, plastic/metal deposition, moulding...) are used to design the complete end effectors. We can call these new devices "surgical chips" (cf. Figure 2).
Figure 2: LAAS/CNRS concept for a new generation of surgical instruments. The chip is dedicated to surgical applications and designed using microelectronic processes.
Indeed, the idea is to consider the MEMS as a microelectronic component dedicated to surgery. It is placed on a generic articulated support (a bent metal sheet) in the same way as electronic components are placed on electronic boards. This microsystem is not necessarily composed of electronic sensors or actuators. It can be a classical tool designed by a microelectronic process in order to have very small and precise structures. In order to demonstrate the application of this concept, we have developed two kinds of instruments using microtechnology techniques. The first model is an articulated plastic/silicon scissor (cf. Figure 3).
Figure 3: Plastic scissor with wet-etched silicon blades, LAAS/CNRS [3].
† MEMS: Micro-Electro-Mechanical Systems; microelectronic devices including several functionalities such as electronics, fluidics, mechanics, optics...
The support of the scissor is designed by water-jet cutting and gluing. The blades are wet etched with KOH to obtain tilted faces (54.7°). This microtechnological process makes it possible to create 200 blades on a single 4-inch wafer. Silicon is really interesting for tissue cutting because it is easy to micromachine this material to obtain several identical sharp and smooth blades, with the possibility to incorporate microsensors (temperature, ultrasound for micro-echography, forces...). These first blades are in a basic form at the moment, but the next, patented version will be equipped with force sensors and electrosurgery electrodes inside the silicon blades. The second model is an 8 mm plastic/metal forceps (cf. Figure 4). The aim of this design is to evaluate the difficulties of wire integration in a small articulated instrument, and a method to reduce the end-effector costs. The designed tool is composed of an articulated structure in bent metal, plastic jaws with four light emitters (blue and yellow) and four strain gauges.
Figure 4: Articulated tool in bent metal sheet with light emitters and strain gauges, LAAS/CNRS.
The model has 3 degrees of freedom, 6 mechanical cables to control it and 10 electrical cables connected to the jaws for the lights and sensors. The tool was tested continuously by activation of the joints to determine the lifetime of the system. After 15 hours the first electrical cable broke, and the first mechanical cable after 25 hours. All materials used in this instrument are biocompatible, can be sterilised and allow building an electronic end effector for around 20 euros. This short presentation gives an idea of the possible advances which microtechnologies can provide in the field of surgical instrumentation. Microelectronic processes allow micromachining silicon structures with sub-micrometer accuracy and combining different functionalities in the same chip: sensors, actuators, signal processing. We tried to use these advantages to design our 5 mm surgical forceps with a 2D sensor for the measurement of thrust and pull forces.
3. Realisation of the 2D sensor for surgical applications
3.1. Constraints of the design

Due to the softness of live tissues, the sensor was placed inside the tool jaws, its size being smaller than a volume of 3.5 × 0.7 × 10 mm³. Tissue manipulation using force feedback induces forces in the range 0.1 N to 2 N, with a required sensitivity of approximately 0.1 N. In spite of this large range and the small volume, we decided to build the whole sensor in silicon. The first tests on the deformable structure of the sensor (micromachined in a 0.5 mm thick wafer) showed that it is possible to measure constraints from 0 N to 3.5 N without breaking the silicon. As the micromachining of 0.5 mm of silicon takes long, we preferred to use a 0.325 mm thick wafer to obtain a sensor working in the range 0 N to 1.7 N. The last critical point of this sensor is the detection of the force variations. Indeed, the forces applied on the sensor induce a displacement of the deformable structure smaller than 0.005 mm for a force of 1.7 N. The sensor we have developed uses capacitive detection, and this small displacement modifies the capacitor values by a very small amount. Therefore, we had to integrate a chip specialized in capacitance measurement close to the sensor.

3.2. Sensor structure and technological steps of the fabrication process

The architecture of this sensor is close to that of micro-accelerometers using capacitive detection, but with a geometry allowing macro-force measurements. The sensor measures 2D forces in the plane of the surgical tool surface. We chose to develop the assembly as a double wafer using Flip Chip (FC) bonding. On the first layer of the sensor, four main springs are attached to a big platform, which is connected by bonding to the second layer thanks to FC (cf. figure 5).
Figure 5: Two layers with complementary electrodes are bonded by flip chip. If a strain is applied on the sensor, the second layer moves and deforms the springs. The output capacitance between the electrodes varies according to the rigidity of the springs. The four springs are designed to resist a 1.7 N force applied on the mobile area.
Applied forces are then estimated with several comb-shaped electrodes located on the first and second layers. A strain applied on the second layer yields a deformation of the main springs. Consequently, the electrodes of the second layer move, inducing a change in the capacitances relative to the orthogonal strain directions between the two layers. Since the spring rigidity is well known, it is possible to estimate the force vector in two orthogonal directions in the wafer plane. The deformable structure was designed as a square shape (1.6 × 1.6 mm) surrounded by branches at right angles, large enough to support shear constraints after the bonding. This silicon structure is really compact, with a high resistance to breaking for its size. We used a finite element simulation to define the geometry of the sensor branches: 0.150 × 0.325 × 0.69 mm branches allow supporting a 1.7 N maximum force. Both layers of the sensor are obtained with the process described by the drawing in figure 6. The deep etching of the first layer, which makes up the springs, is really difficult to achieve as the trenches micromachined in the silicon must be very thin. This thickness induces a 3 h 30 etching time to produce the branches correctly for 0.03 mm trenches. If we reduce the size of the trenches, the etching time increases greatly and it is really difficult to obtain the release of the springs. The four electrodes of the sensor are obtained by titanium and gold deposition on silicon dioxide, and finally both chips are bonded with a conductive silver glue.
Figure 6: The two chips of the sensor and the structure after bonding with a silver glue for the electrical conductivity between the two layers, LAAS/CNRS.
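Since the spring rigidity is known, the in-plane force components can be recovered from the measured electrode displacements via Hooke's law. The sketch below is illustrative only; the stiffness value is a rough estimate inferred from the stated figures (about 1.7 N at a deflection below 0.005 mm), not a measured or calibrated parameter:

```python
# Rough stiffness estimate: ~1.7 N at ~0.005 mm (5 um) deflection,
# values taken from the paper; the real calibrated stiffness differs.
K_SPRING = 1.7 / 5e-6   # N/m, roughly 3.4e5

def force_vector(dx, dy, k=K_SPRING):
    """Estimate the in-plane force components (N) from the spring
    deflections dx, dy (m) measured by the two orthogonal comb
    capacitors, using Hooke's law F = k * delta."""
    return k * dx, k * dy

# A 2.5 um deflection along x corresponds to roughly half the full range.
fx, fy = force_vector(2.5e-6, 0.0)
```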
3.3. Mechanical simulations and experimental results

We designed the spring structure with finite element simulations to evaluate the displacement of the branches when forces are applied. These simulations allow determining the optimal geometry of the capacitive electrodes when we use
0.150 × 0.325 × 0.69 mm³ branches. The graph in figure 7 presents the difference between the simulations and the experimental results for several values of forces applied on the deformable body of the sensor.
Figure 7: The finite element simulation and the experimental data give the same result for the displacement of the springs. Breaking appears above a 1.7 N force, which represents a stress in the silicon of 300 MPa (E_Si = 130 GPa, ν = 0.064).
The experimental results obtained with an optical profilometer are linear and really close to the simulation, which is really interesting for designing the most suitable structure for tissue or needle manipulation.

3.4. Characterisation of the sensor
In order to calibrate our sensors we have developed a rotating stage allowing the measurement of the four capacitance values of the sensor. This system can apply a continuous force or displacement on the sensor and stores the output signal variations, measured by a specialised chip from Irvine Sensors Inc. (MS3110). The Irvine Sensors chip is interesting because it exists in die version for direct integration inside the jaws of the surgical tools. The next graph shows the values of the four sensors from 0 to 180° for a 0.5 N force. These results are very interesting as they show a problem due to the thickness of the silver glue.
Figure 8: The SEM image shows the 6 µm space existing between the two layers. The curves are obtained when a 0.5 N force is applied on the sensor. The data show a competition between the layer distance and the electrode overlap.
Indeed, theoretically the two layers of the sensor should be in contact thanks to the four blocks, but the thickness of the glue creates an unexpectedly large space between the layers, and the sensor essentially measures the tilting of the second layer. In fact, the silver particles are too big and it is impossible to obtain a very thin glue film. As shown in the picture of figure 8, the distance between the two layers after the flip chip bonding is around 6 µm. The first curves obtained are repeatable and proportional to the forces applied in the direction of the sensor surface. These first results are really encouraging; nevertheless, we must modify the second layer in order to create enough space for the glue and to obtain contact between the two chips.
4. Conclusions
This first realisation allowed solving several problems and will be used to improve the next version of the sensor. Indeed, the micromachining of the silicon is well controlled, and the knowledge of the bonding step will help to effectively modify the geometry of the second layer. The bending strength of the spring branches is really good, as we obtained a large range of displacement and macro-force magnitude with the silicon material. The chip used for the capacitance detection is well adapted to our sensor and will allow the design of a complete microsystem for integration inside a surgical tool. Several tests on different sensors showed the same curve shapes with good repeatability and linearity in the measurements. In this preliminary study we will most probably use a neural network to estimate, from the database of one sensor, a model to evaluate the force magnitude and direction from the four capacitance value variations. The next version of the sensor will have the second layer designed in SiO2, as we noticed some parasitic capacitances (40 pF) between the two layers if they are both in silicon. We can also design the sensor with curved electrodes to eliminate possible torsion effects.
References

1. A. Giraud, D. Esteve, F. Van Meer, P. Caron, LAAS/CNRS, SINTERS SA, French Patent FR0213169: Robot chirurgical d'orientation et de positionnement d'un instrument chirurgical porteur d'un outil terminal chirurgical, (2002).
2. C. R. Wagner, N. Stylopoulos, R. D. Howe, The Role of Force Feedback in Surgery: Analysis of Blunt Dissection, Tenth Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, (2002).
3. J. Dargahi, M. Parameswaran, S. Payandeh, A micromachined piezoelectric tactile sensor for an endoscopic grasper: theory, fabrication and experiments, Journal of Micro-Electromechanical Systems, Vol. 9, No. 3, pp. 329-336, (2000).
4. D. Esteve, F. Van Meer, A. Giraud, N. Villeroy, LAAS/CNRS, SINTERS SA, French Patent FR0213168: Outil terminal d'instrument chirurgical, (2002).
IMPROVEMENT OF COMPUTER- AND ROBOT-ASSISTED SURGERY AT THE LATERAL SKULL BASE BY SENSORY FEEDBACK
DIRK MALTHAN¹, JAN STALLKAMP¹, STEFAN WOESSNER¹, ERWIN SCHWADERER², FLORIAN DAMMANN², MARCUS M. MAASSEN³
¹Fraunhofer Institute for Manufacturing Engineering and Automation IPA, Nobelstraße 12, 70569 Stuttgart, Germany, malthan@ipa.fhg.de, +49 (0)711/970-1862
²Department of Diagnostic Radiology, University Hospital of Tübingen
³Department of Otolaryngology, Head & Neck Surgery, University Hospital of Tübingen
This work was supported by grant No. MA 1458/2 of the German Research Foundation (DFG).

Lately, surgeons performing lateral skull base surgery have been paying more attention to the development and advancement of computer-aided surgical procedures. Especially the interconnection between electronic hearing implants (e.g. the Otologics MET transducer) and receptor tissues requires an accurate milling of skull cavities to ensure a proper coupling angle and a long-term stable fixation. We developed a PC-based computer- and robot-assisted system for the precise and efficient planning and execution of surgical processes at the lateral skull base. Employing multimodal sensory feedback, the geometry of the surgical area can be analyzed with a resolution of a few µm. Laboratory testing demonstrated that robot-assisted skull base surgery could thereby be improved: both the reliability and the efficiency of the performed surgical process steps were augmented. In addition, the acquired data could be used to implement an exhaustive intraoperative procedural documentation needed for quality assurance and quality management tasks.
1. Introduction
In otosurgery, one of the most challenging surgical procedures is the implantation of electronic hearing devices, requiring a highly reproducible and accurate fixation technique. Here, the interconnection between the amplifier's vibratory actuator and the receptor tissue, and the fixation of so-called mounting brackets, require an accurate milling of the skull cavities [1]. In order to avoid a distortion of the transmitted acoustic signals, the implantation angle has to be defined very carefully (Figure 1). Employing conventional techniques, the anatomical connection between the hearing aid and the receptor tissue can hardly be optimized. These difficulties may result in a disproportionate resection of bone tissue and a consequent loosening of the implant. Computer-assisted surgical systems are expected to provide an essential improvement of both efficiency and precision for interventions at the lateral skull
base in the near future [2]. Within a funded joint project, a computer- and robot-assisted surgical system has been developed, meeting the requirements of implantations at the lateral skull base by the integration of sensory feedback. Already the first milling experiments of the Department of Otolaryngology, Head and Neck Surgery at the University Hospital of Tübingen together with Fraunhofer IPA pointed out the potential of robot-assisted milling for an improvement of precision and efficiency in operations at the lateral skull base [3]. In order to establish the complete surgical process chain, however, an integrated planning and control system meeting the precision requirements of the intervention was necessary. Furthermore, the realized system had to guarantee a favourable relation between benefits and costs, e.g. a gain in quality and efficiency versus higher system expenses. This should be realized by a highly automated process control in combination with a simplified, straightforward and process-driven user interface, building a so-to-speak kind of "surgical assistant".
Figure 1. The interconnection of the Otologics MET Middle Ear Transducer with the ossicular chain requires a highly precise cavity milling.
2. Planning and Control System
We developed a software system for the precise planning and execution of computer- and robot-assisted interventions at the lateral skull base. The development is based upon a PC platform, utilizing the available libraries "Visualization Toolkit" (VTK) [4], "Insight Segmentation and Registration Toolkit" (ITK) [5] and "Apache XML Parser" (Xerces) [6]. In order to realize a both flexible and dependable CAS/RAS system, a modular and distributed control and communication architecture was realized (Figure 2). It consists of several modules, classified into the so-called hardware, control and interaction layers. All modules are physically independent executable software programs. Thus, they all have the ability to supervise
the correct function of the other modules and are not affected by malfunctions of associated software components. In this way a redundant software system is built, as also described for other critical application scenarios [7]. The communication between the modules is based upon "Industrial Ethernet", a real-time variant of the well-known TCP/IP protocol. The command sequences are transmitted using the XML ("Extensible Markup Language") standard, a flexible and expandable method of message transfer. Furthermore, with XML the syntax of transferred messages can be checked automatically in order to avoid transfer errors. The interconnection of the robot system is realized bi-directionally: control sequences and parameter messages are transferred to the hardware layer, while continuous feedback of the robot position and the status of the hardware components is provided. Additionally, sensory information like the forces acting on the tool tip and the data of the optical sensor systems are fed back to the higher-level algorithms.
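The automatic syntax check of XML command messages can be illustrated with a short sketch. The message format below (the `command` element and its `target`/`speed` children) is hypothetical, not the actual CASIRAS message schema; the point is only how a module can reject malformed or incomplete transfers before dispatching them:

```python
# Sketch of XML-based module communication with automatic syntax checking.
# The message format (tags, attributes) is hypothetical; the real system
# defines its own schema.
import xml.etree.ElementTree as ET

REQUIRED_CHILDREN = {"target", "speed"}  # fields every milling command must carry

def parse_command(message: str):
    """Parse an XML command and reject syntactically incomplete messages."""
    try:
        root = ET.fromstring(message)
    except ET.ParseError as exc:
        raise ValueError(f"malformed XML, transfer error suspected: {exc}")
    if root.tag != "command":
        raise ValueError(f"unexpected root element: {root.tag}")
    present = {child.tag for child in root}
    missing = REQUIRED_CHILDREN - present
    if missing:
        raise ValueError(f"incomplete command, missing: {sorted(missing)}")
    return {child.tag: child.text for child in root}

msg = "<command module='robot'><target>12.5 3.0 -7.1</target><speed>2.0</speed></command>"
cmd = parse_command(msg)
print(cmd["speed"])
```

A receiving module would call `parse_command` on every incoming message and report a transfer error back to the sender whenever a `ValueError` is raised.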
3. Hardware Set-up

In order to evaluate the developed planning and control system for computer- and robot-assisted interventions at the lateral skull base, a laboratory test scenario was realized. The central element of the hardware setup is a modified industrial robot system "Adept Six-300" (Adept Technology GmbH, Dortmund, Germany). The employed robot system is distinguished by its articulated kinematics with six degrees of freedom (6-DOF), allowing for flexible preparation angles. It is equipped with a 6-DOF force and momentum sensor system "Adept Force" (Adept Technology GmbH, Dortmund, Germany), so that any impact on the robot's tool tip can be recorded and supervised. In addition to the mechanical sensor systems, two different optical sensor systems have been utilized. A laser-based triangulation distance sensor "LT 100" (ElcoTec GmbH, Dortmund, Germany) has been integrated for the precise identification of the in-situ geometry. It consists of a laser source whose beam is reflected by the patient's surface and a CCD camera recording the reflection angle. In this way, in combination with the robot's kinematics, a laser scanner has been realized, allowing a determination of the patient's topology with a precision better than 10 µm in every direction. For the differentiation of tissues, an auto-fluorescence sensor system "S2000" (Oceanoptics Europe B.V., Duiven, The Netherlands) has been integrated. The tissue region to be analysed is irradiated with ultra-violet light, and the resulting auto-fluorescence spectra are recorded and automatically interpreted. The milling of the skull cavities is done by a conventional
bone drilling system "Elan EC" (Aesculap AG & Co. KG, Tuttlingen, Germany). In this way, the results of the robot-assisted intervention are comparable to the manual preparation of the required skull cavities.
Figure 3: Overall set-up of the realized system for robot-assisted interventions with sensory feedback at the lateral skull base (left). Detailed view of the sensory tool tip.
The user interaction with the operating set-up is performed via a fully sterilizable touch-screen ("Videre", Fraunhofer IPA), using a disposable infrared pen and a projected graphical user interface. The overall set-up of the laboratory system is shown in Figure 3.
4. Closed-loop Robot Control
The integration of sensor systems feeding back in-situ information allows the robot system to be employed in a so-called closed-loop control architecture (Figure 4). Here, actual and nominal values are compared continuously, so that e.g. planned milling paths can be adjusted to intra-operative changes. This results in an increased precision and safety of the milling procedure.
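The continuous comparison of nominal against actual values can be sketched as a simple proportional correction loop. The gain and tolerance below are illustrative numbers, not parameters of the real milling controller:

```python
# Minimal sketch of the closed-loop idea: the nominal (planned) value is
# compared against the measured actual value on every cycle, and the next
# command is corrected accordingly. Gain and tolerance are illustrative.
def control_step(nominal: float, actual: float, gain: float = 0.5,
                 tolerance: float = 0.05) -> float:
    """Return a corrective command; zero once the error is within tolerance."""
    error = nominal - actual
    if abs(error) <= tolerance:
        return 0.0
    return gain * error

# Simulated convergence of an actual position toward a planned depth of 3.0 mm
actual, nominal = 0.0, 3.0
for _ in range(20):
    actual += control_step(nominal, actual)
print(round(actual, 2))  # converges to within the 0.05 mm tolerance
```

In the real system the "actual" value comes from the sensors described above (laser scanner, force sensor) rather than from a simulation.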
Closed-Loop Control
The surveillance of the acting forces ensures non-pathologic force impacts on the patient. The measured forces are recorded and transmitted to the control system. For the estimation of the forces within a surgical step, algorithms were implemented that allow a simulation of the force impacts. In case of unacceptable errors, the robot-assisted surgical process is interrupted. Current research efforts show that the analysis of recorded forces may help to identify anatomical structures, e.g. the dura mater [8]. These results may supplement the developed sensor system in the near future.
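The record-and-interrupt behaviour described above can be sketched as follows. The 5 N limit is an arbitrary illustrative value, not a clinical threshold from the paper:

```python
# Sketch of the force surveillance: measured tool-tip forces are recorded for
# documentation, and the process is interrupted when a limit is exceeded.
# The limit is an illustrative value, not a validated clinical threshold.
FORCE_LIMIT_N = 5.0

def supervise(force_samples):
    """Record all samples seen so far; report whether milling may continue."""
    log = []
    for f in force_samples:
        log.append(f)                  # every sample is kept for documentation
        if abs(f) > FORCE_LIMIT_N:
            return log, False          # unacceptable force: interrupt milling
    return log, True

log, ok = supervise([0.8, 1.2, 2.5, 6.1, 1.0])
print(ok, len(log))  # interrupted at the fourth sample
```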
5. Topology Comparison and Registration

In particular, the transition from the computer-assisted planning of a surgical procedure to its execution is regarded as a critical step. For this step a transformation of the planning coordinate system into that of the patient is necessary: the patient has to be registered. Within the developed system, the comparison of the nominal against the actual surface geometry using the laser scanner builds the basis for an intra-operative and continuous registration step. In contrast to existing surface-based registration techniques like ICP (Iterative Closest Point) algorithms, the developed algorithms do not match the whole surface in order to realize a registration. Instead, significant anatomical landmarks are identified in the actual and nominal data sets and assigned to one another (Figure 4). The technique is based upon a procedure called "spin images" [9], which was, however, modified in important pre- and post-processing steps. The advantages include robustness against malformed surface reconstructions, resulting e.g. from blood and rinsing liquids.
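The modified spin-image matching itself is beyond a short sketch, but once landmark correspondences between the nominal and actual data sets have been established, the rigid registration transform can be computed in closed form. The following generic SVD-based (Kabsch) estimate is an illustration under the assumption of exactly corresponding landmark arrays, not the paper's algorithm:

```python
# Generic closed-form rigid registration from paired landmarks (Kabsch method).
# Illustrative only; the paper uses a modified spin-image pipeline upstream of
# this step to establish the correspondences.
import numpy as np

def rigid_register(nominal: np.ndarray, actual: np.ndarray):
    """Return rotation R and translation t with actual ≈ R @ nominal + t."""
    cn, ca = nominal.mean(axis=0), actual.mean(axis=0)
    H = (nominal - cn).T @ (actual - ca)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = ca - R @ cn
    return R, t

nominal = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [0., 0., 10.]])
# Actual landmarks: nominal rotated 90 degrees about z and shifted by (1, 2, 3)
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
actual = nominal @ Rz.T + np.array([1., 2., 3.])
R, t = rigid_register(nominal, actual)
print(np.allclose(R, Rz), np.allclose(t, [1, 2, 3]))
```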
Figure 4: Principle of the realized online-registration algorithm (suboptimal vs. optimal registration)
The precision of the surface-based registration was evaluated by measuring artificial landmarks in the form of metal registration beads. After the automatic surface-based registration, the positions of the registration beads were identified by the robot system. To determine the robustness of the procedure, a test series with 50 experiments was performed, in which the head model was deformed using polygon reduction and smoothing, and the addition of error noise [10]. The results of these experiments are shown in Figure 5. In 48 of the 50 cases the registration was accomplished with an absolute error smaller than ±0.7 mm. In two cases, however, the registration was not accomplished correctly.
Figure 5: Evaluation of typical surface registration error using polygon smoothing, polygon reduction, partial addition of error noise, and global addition of error noise (from left to right).
Furthermore, within the milling process the planned and simulated surface topology of the skull cavity can be compared with the performed milling process. Prior to each shift of the milling head, the in-situ geometry is evaluated in real time. Only in the case of negligible error values is the milling process continued. Otherwise, a re-registration of the patient is performed automatically.
6. Evaluation of the Tissue Differentiation System

Besides the geometrical and force-based feedback of sensory information, methods for the intra-operative differentiation of tissues were employed. These were, on the one hand, the measurement of infra-red reflections of tissue samples and, on the other, the analysis of auto-fluorescence spectra of the test bodies. Experimental series were performed on animal tissue samples of pigs and mice, and on pure NADH (Nicotinamide Adenine Dinucleotide-Hydrogen). NADH, which is very closely related to niacin (vitamin B3), is also known as coenzyme 1 and is mainly involved in the cells' metabolism. The reflection of light in the near infra-red range showed an increased reflectivity of bone tissue compared with the reflectivity of soft (e.g. muscle) tissue. This allows a sensory differentiation of hard and soft tissues. Furthermore, it has been ascertained that the auto-fluorescence properties of bone differ from those of muscle or nervous tissue in significant details. Irradiated with ultra-violet light, bone or cartilage tissue showed a significant auto-fluorescence in the range between 430 nm and 460 nm [11]. As muscle or nervous tissue does not show this characteristic, an automated interpretation of the auto-fluorescence spectrum could be realized.
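The automated interpretation can be reduced to checking whether the 430-460 nm band stands out against the rest of the spectrum. The band-to-baseline ratio threshold below is an illustrative value, not the calibrated criterion of the S2000 system:

```python
# Sketch of an automated interpretation of the auto-fluorescence spectrum:
# bone/cartilage fluoresces markedly between 430 nm and 460 nm under UV
# excitation, muscle and nervous tissue do not. The threshold is illustrative.
def classify_tissue(spectrum, band=(430.0, 460.0), ratio_threshold=2.0):
    """spectrum: list of (wavelength_nm, intensity_in_arbitrary_units) pairs."""
    in_band = [i for (w, i) in spectrum if band[0] <= w <= band[1]]
    out_band = [i for (w, i) in spectrum if not band[0] <= w <= band[1]]
    if not in_band or not out_band:
        raise ValueError("spectrum does not cover the required range")
    ratio = (sum(in_band) / len(in_band)) / max(sum(out_band) / len(out_band), 1e-9)
    return "bone/cartilage" if ratio >= ratio_threshold else "soft tissue"

bone_like = [(410, 5), (440, 80), (450, 90), (500, 8)]      # strong 430-460 peak
muscle_like = [(410, 5), (440, 7), (450, 6), (500, 8)]      # flat spectrum
print(classify_tissue(bone_like), classify_tissue(muscle_like))  # bone/cartilage soft tissue
```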
Figure 6: Differentiation of bone and muscle tissue by auto-fluorescence (a.u. = arbitrary unit)
These results allow for a further integration into the feedback loop in the next development period. Thus, the supervision of anatomical structures, e.g. the nervus facialis or the dura mater, becomes possible.

7. Conclusion
A computer- and robot-assisted system for microsurgical high-precision interventions at the lateral skull base has been developed. The system employs different sensor systems, such as force sensors, laser triangulation sensors and auto-fluorescence sensors for tissue differentiation, in order to realize quality assurance and documentation steps. It could already be shown that the feedback of sensory information improves the safety and the precision of robot-assisted surgical procedures [12]. Within the next project period, the described analysis algorithms will be optimized to work under clinical conditions. The robot system will be integrated into an experimental operating room environment, and further clinical evaluation studies will be performed.

References
1. Maassen M.M., Dammann F., Malthan D., Stallkamp J., Schwaderer E., Zenner H.P.: Development of a robot prototype with sensory feedback. 3rd Symposium on Middle Ear Mechanics in Research and Otology. Matsuyama, Ehime, Japan, 9th-12th July, 2003.
2. Dammann F., Schwaderer E., Seemann M.: Einsatz von Methoden der 3D-Visualisierung im Kopf-Hals-Bereich. Fortschr Rontgenstr 2003; 175: 86.
3. Plinkert P.K., Plinkert B., Hiller A., Stallkamp J.: Einsatz eines Roboters an der lateralen Schadelbasis. Evaluation einer robotergesteuerten Mastoidektomie am anatomischen Praparat. HNO 2001 Jul; 49(7): 514-22.
4. http://www.vtk.org, 27.02.2004
5. http://www.itk.org, 27.02.2004
6. http://xml.apache.org/xerces-c/index.html, 27.02.2004
7. Bowen J., Stavridou V.: Safety-critical systems, formal methods and standards. Oxford University Technical Report PRG-TR-5-92, Software Engineering Journal, 178-183.
8. Federspil P.A., Geisthoff U.W., Henrich D., Plinkert P.K.: Development of the First Force-Controlled Robot for Otoneurosurgery. Laryngoscope 2003 Mar; 113(3): 465-471.
9. Johnson A.E.: Spin-Images: A Representation for 3-D Surface Matching. Carnegie Mellon University, Pittsburgh, Pennsylvania, August 1998.
10. Malthan D., Stallkamp J., Wossner S., Dammann F., Zenner H.P., Maassen M.M.: Lasertriangulationsgesteuerte Sensorik und miniaturisierte Roboterkinematik konnen Implantationen an der lateralen Schadelbasis verbessern. 11. Jahrestagung der Sektion Neuroendoskopie, Neuronavigation und intraoperative Bildgebung der Deutschen Gesellschaft fur Neurochirurgie, Leipzig, 5. Oktober 2002.
11. Woessner S., Huen J., Malthan D.: Differentiating tissue by fluorescence spectroscopy. In: Proceedings of SPIE Vol. 5261, Rhode Island, October 2003, in print.
12. Maassen M.M., Dammann F., Malthan D., Stallkamp J., Schwaderer E., Zenner H.P.: Erhohung der Prazision von computer- und roboterassistierten Operationen an der lateralen Schadelbasis durch sensorische Ruckkopplung. Jahrestagung der Deutschen Gesellschaft fur Hals-, Nasen-, Ohrenkrankheiten, Kopf- und Halschirurgie. Dresden, 28. Mai bis 1. Juni 2003.
A SURGICAL MECHATRONIC ASSISTANCE SYSTEM WITH HAPTIC INTERFACE*

S. PIECK, I. GROSS, P. KNAPPE, J. WAHRBURG
Zentrum für Sensorsysteme (ZESS), University of Siegen, Paul-Bonatz-Str. 9-11, 57068 Siegen, Germany
The versatile navigated mechatronic system to assist in surgical interventions presented here is characterized by the integration of an optical navigation system and a mechatronic arm. In contrast to known autonomous solutions, our highly interactive system enables the surgeon to intervene in the system's work at any time. The interactive faculties of the mechatronic system are achieved mainly by 1) optical position control of the mechatronic arm and 2) a haptic interface to the mechatronic arm. Unique to this system, and resulting from 1), is that the mechatronic arm keeps track of the patient's position even during patient movements. This online tracking capability eliminates the need for rigid patient fixation. As this approach integrates a conventional navigation system, the surgeon can utilize it as accustomed for additional support, if desired. The haptic interface 2), using a force-torque sensor, permits the surgeon to move the mechatronic arm from its home position to its working position. No sophisticated algorithms have to be implemented for collision avoidance. The surgeon guides the arm easily by grabbing a grip and pulling it into the desired position. This integrates the mechatronic assistance system seamlessly into the operating procedure, because there is no need to use any input device like a mouse, a touch screen or a keyboard.
1. Introduction
This contribution illustrates the current status in the development of a new versatile surgical mechatronic assistance system. Our design started from the analysis of existing solutions for computer-assisted and image-guided surgery [1]-[3], [11]. Commercial systems may essentially be divided into two main groups, consisting of passive navigation systems on the one hand and active robotic systems on the other. The clinical experience with available products of both groups has shown that they still have certain drawbacks and limitations. Most of them do not yet simultaneously fulfil very important user requirements, such as
- No significant extension of the operating time compared to pure manual surgery,
- User friendliness and easy operation of the system,
- Robustness against disturbances or incorrect operation,
- Support of less invasive procedures to minimize trauma without increasing risks or reducing accuracy,
- Adaptability to various surgical procedures.

The major goal of our work is to meet these demands with best performance by synthesizing the specific advantages of a navigation system and a mechatronic arm into a combined system, a "navigated mechatronic assistance system" (MAS).

* This work is supported by the German BMWA InnoNet Program and the DFG SPP-1124.
2. Materials and Methods
According to the requirements listed above, the whole system concept is based on three main design rules:

1) Modularity: With a modular system design it is very easy to adjust the system to new applications by plugging in the desired components.
2) Interactivity: Through an interactive interface the system gains much more acceptance by its users than with a static built-in workflow. Thus the system is much better applicable in difficult interventions, where no scheduling can be done preoperatively.
3) Universality: The mechatronic assistance system must be adaptable to a wide range of tasks.
2.1. Combining a 3D digitizing system and a mechatronic arm

This concept has been investigated over several years. Existing solutions for computer-aided surgery are either based on navigation or on robotics, which we consider separate or only loosely coupled approaches [8][9]. In contrast, we have developed an integrated approach that combines a 3D digitizing system and an active mechatronic arm [4][5], leading to a navigated mechatronic system. As illustrated by Fig. 1, a computer system has been configured to connect and control both parts and to synchronize their operation. During system initialization, a short calibration procedure is carried out to align the coordinate systems. It is essentially based on evaluating the signals of a DRB (dynamic reference base) element of the digitizing system which is attached to the wrist of the mechatronic arm. During the setup a transformation matrix is calculated which facilitates the specification and execution of movements of the mechatronic arm with reference to the coordinate system of the digitizer.
Fig. 1: Components of the navigated mechatronic assistance system (control system, tracking system, arm)

Fig. 2: Coordinate transformation
The first intra-operative step using the navigated MAS, the registration of the patient anatomy, can be performed without having to move the mechatronic arm, as is necessary in the conventional operation of the existing surgical robots "Robodoc" and "CASPAR" (see [13][14] and the survey in [6]). Instead, the surgeon uses the pointer of the 3D digitizing system to collect data in a fast and, as no active system directly touches the patient, safer way. The procedure is identical to the registration process applied for navigation systems and offers similar convenience and flexibility to measure either point or surface data sets, depending on the requirements of the matching algorithm which is used to match pre-operative and intra-operative data.

The involved transformations are (tool geometry and arm calibration, Eq. 1; planning and matching, Eq. 2):

CamT_Tool = CamT_Arm · ArmT_TCP · TCPT_Tool   (Eq. 1)

CamT_Tar = CamT_Pat · PatT_Img · ImgT_Tar   (Eq. 2)

TCPT_Tar = (CamT_Arm · ArmT_TCP)^-1 · CamT_Tar   (Eq. 3)

The various coordinate systems of all sub-systems are shown in Fig. 2. When the coordinates of the anatomy to be operated on have been determined intra-operatively and matched with the pre-operative planning (Eq. 2), the control computer calculates the target location in coordinates of the mechatronic arm (Eq. 1) and issues corresponding commands to the mechatronic arm to move the surgical tool to the target (Eq. 3). We are presently focussing on applications where it appears to be advantageous that the MAS does not perform surgery completely automatically, but just positions the instrument with pre-planned orientation. The mechatronic arm may thus be regarded as a controlled active actuator of a navigation system that substitutes for manual instrument guidance. The surgical tool is, for example, fixed on a one-degree-of-freedom carriage which is mounted at the wrist of the mechatronic arm and moved manually by the surgeon. In this way the surgeon on the one hand has some force feedback of the working process and keeps control of the operation, while he can be certain on the other hand that the tool maintains its correct orientation and that there are no unintentional deviations caused, for example, by slipping or inhomogeneous bone structure.
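The coordinate chain of Eq. 1-3 can be made concrete with 4x4 homogeneous transforms. The matrices below are placeholder translations with made-up values; in the real system they come from arm calibration, tool geometry, planning and matching:

```python
# The coordinate chain of Eq. 1-3 expressed with 4x4 homogeneous transforms.
# All matrices here are illustrative translation placeholders.
import numpy as np

def transform(translation):
    T = np.eye(4)
    T[:3, 3] = translation
    return T

T_cam_arm = transform([100., 0., 0.])   # camera -> arm base (arm calibration)
T_arm_tcp = transform([0., 50., 0.])    # arm base -> tool centre point
T_tcp_tool = transform([0., 0., 10.])   # TCP -> tool tip (tool geometry)
T_cam_tar = transform([120., 60., 5.])  # camera -> target (matching, Eq. 2)

# Eq. 1: camera -> tool tip
T_cam_tool = T_cam_arm @ T_arm_tcp @ T_tcp_tool
# Eq. 3: target expressed in TCP coordinates
T_tcp_tar = np.linalg.inv(T_cam_arm @ T_arm_tcp) @ T_cam_tar

print(T_cam_tool[:3, 3])  # tool tip position in camera coordinates
print(T_tcp_tar[:3, 3])   # target position relative to the TCP
```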
Compared to standard navigation systems, the alignment between the reference position and the actual position of the tool is automated. The surgeon does not have to permanently shift his gaze from the operating area to the computer screen to monitor the instrument position, but can fully concentrate on the operating area.
2.2. Automatic tracking of the MAS to the patient position

A special feature of the navigated MAS is its ability to automatically track possible movements of the patient. A rigid fixation of the patient structure to be operated on, as in [6][10][13][14], is therefore not necessary. The underlying technology is based on a DRB of the 3D digitizing system which is rigidly attached to the bone by a suitable fixation mechanism. Bone motions detected by the DRB are processed in the control computer, which connects the digitizing system and the mechatronic arm, and new target coordinates for the mechatronic arm are calculated accordingly. Fig. 3 illustrates the patient tracking control loop. A unique feature of our system is its high closed-loop bandwidth of 60 Hz, which enables tracking of possible patient movements in real time.
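The tracking loop can be sketched in one dimension: each cycle reads the DRB pose and recomputes the arm target so that the planned target, which is defined relative to the bone, is followed. Poses are reduced to scalar offsets purely for brevity; the real loop runs at 60 Hz on full 6-DOF poses:

```python
# Sketch of the patient-tracking loop: each cycle reads the DRB (bone) pose
# from the digitizer and recomputes the arm target so that the tool follows
# patient movements. Poses are reduced to 1-D offsets for brevity.
def tracking_loop(drb_offsets, planned_target_in_drb=5.0):
    """Return the arm target for each measured bone (DRB) position."""
    targets = []
    for drb in drb_offsets:              # one reading per 1/60 s cycle
        targets.append(drb + planned_target_in_drb)
    return targets

# The patient moves by 2 mm during the procedure; the target follows.
print(tracking_loop([0.0, 0.5, 2.0]))  # [5.0, 5.5, 7.0]
```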
Fig. 3: Control loops of the system (planning data, force-torque sensor, Polaris 3D digitizer, mechatronic arm, surgeon/user)
2.3. Haptic control of the mechatronic arm

The user interface of the MAS has been designed for highly interactive operation. The mechatronic arm is equipped with a force-torque sensor (FTS) mounted at its wrist which detects external forces applied to the MAS. The surgeon can thus push or pull the MAS towards the desired operating region. This is done by simply gripping the handle mounted to the wrist and guiding the arm towards the target area. Hence the mechatronic arm control does not have to deal with collision detection for lamps or any other obstacles in the OR. This integrates the robot seamlessly into the operating procedure, because there is no need to use any input device like a mouse, touch screen or keyboard. After coarse positioning by hand is completed, the system can be switched to automatic control, and fine positioning is carried out under computer control according to the preoperative planning. It is possible to switch back to manual haptic control whenever the surgeon wishes to do so, for example if he wants to push back the mechatronic arm temporarily during the operation. As the FTS also measures any force or torque applied to the surgical tools, the measurements can be used to give further information to the control unit to ensure safety. The conjoint control loop for tracking and haptic control is illustrated in Fig. 3.
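The force-to-motion mapping behind such hand guidance is commonly realized as admittance control: the measured wrist force is translated into a commanded velocity. The sketch below is a generic one-axis illustration with made-up gains and limits, not the control law of the actual MAS:

```python
# Sketch of the haptic guidance idea: the wrist force-torque sensor reading is
# turned into an arm velocity (admittance control), with a dead band so that
# sensor noise does not move the arm and a clamp as a simple safety limit.
# Gains and limits are illustrative values only.
DEAD_BAND_N = 0.5      # ignore forces below this magnitude
GAIN_MM_PER_N = 2.0    # compliance: commanded velocity per unit force
V_MAX_MM_S = 50.0      # safety clamp on the commanded speed

def admittance_velocity(force_n: float) -> float:
    """Map a measured hand force [N] to a commanded arm velocity [mm/s]."""
    if abs(force_n) < DEAD_BAND_N:
        return 0.0
    v = GAIN_MM_PER_N * force_n
    return max(-V_MAX_MM_S, min(V_MAX_MM_S, v))

print(admittance_velocity(0.2), admittance_velocity(10.0), admittance_velocity(40.0))
# 0.0 (inside dead band), 20.0, 50.0 (clamped)
```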
3. Prototype System Setup
A prototype of the navigated MAS which has been set up in our lab is shown in Fig. 4. It consists of a "Polaris" digitizing system from NDI Inc., Canada, and a lightweight (35 kg) mechatronic arm from Mitsubishi, Japan.
Fig. 4: Photo of the prototype system
The mechatronic arm is equipped with a tool coupler that picks up the modularized tools. A built-in tool identification is used to avoid faulty usage. For the first application of the navigated MAS, the system is tailored to support the implantation of the acetabular cup in a less invasive total hip replacement procedure. According to our approach of using the mechatronic arm as a tool positioning system instead of performing surgery automatically, a modified conventional reaming tool is mounted at the wrist of the mechatronic arm. It is operated manually by the surgeon, while the mechatronic arm updates the position and orientation of the tool in case of patient movements. Present results show that patient tracking is possible in real time. Furthermore, the system prevents the surgeon from reaming too deep into the acetabulum. A more detailed, medically orientated description of the preclinical evaluation of the system is given in [5]. As universality is one of our main design rules, ongoing development addresses the extension of the MAS to support further applications. Examples are endoscope guidance, milling in the lateral skull base (ENT), and spine surgery, for which tools are under development.
4. Results
First pre-clinical and clinical tests have shown that in the first realised application - the implantation of an artificial hip cup prosthesis - a significant improvement compared to pure manual surgery can be achieved by using the presented surgical assistance system. The performed interventions showed that the system slightly extends the intervention time, by 10 to 15 min, but due to the exact tool positioning the contact area between bone and implant can be increased significantly. Deviations of more than 10 degrees from the pre-planned cup orientation, which may occur occasionally when the artificial hip cup is implanted manually, can be reliably avoided. During the milling into the hip bone and the positioning procedure a deviation of less than 1 degree in tool orientation and of less than 1 mm from the pre-planned position can be ensured, as measured with the digitizing system [7]. Fig. 5 shows the tool-angle error recorded during the milling process [12]. The accuracy achieved so far is sufficient for orthopaedic applications in any case. Due to the modular concept, other control strategies or other devices for position measurement [7] can be integrated easily to increase precision for other applications, if necessary.
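The reported orientation accuracy can be checked by computing the angle between the planned and the achieved cup axis. A small worked example with made-up unit vectors follows (the actual clinical measurements are those of [7] and [12]):

```python
# Angular deviation between a planned and an achieved tool/cup axis.
# The two direction vectors are illustrative, not measured data.
import math

def angle_deg(u, v):
    """Angle in degrees between two 3-D direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

planned = (0.0, 0.0, 1.0)
achieved = (0.0, math.sin(math.radians(0.8)), math.cos(math.radians(0.8)))
print(round(angle_deg(planned, achieved), 1))  # about 0.8, within the 1-degree bound
```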
Fig. 5: Tool-angle error during the milling process [12] (x-axis: milling time / seconds; y-axis: tool-angle error)
5. Discussion
A new concept for a navigated surgical mechatronic assistance system has been presented. Its functionality can be considered as an advanced interactive tool which enables the surgeon to transfer preoperative planning to intra-operative results with high precision and reproducibility, thereby combining the advantages of a navigation system and an active mechatronic arm. It is not necessary to fix the patient during the intervention because the MAS tracks any occurring patient movements automatically. The MAS-supported implantation of the acetabular cup in total hip replacement is currently being evaluated intensively as a first application. Based on the promising results achieved so far, including successful clinical trials, further software modules and tool components for other applications of the navigated MAS, e.g. for knee and spine surgery, will follow.
References
1. Cinquin P. et al., Computer assisted medical interventions, IEEE Engineering in Medicine and Biology, Vol. 14, 1995, pp. 254-263.
2. Davies B., Robotics in surgery, Special issue, IEEE Engineering in Medicine and Biology Magazine, Vol. 14, 1995.
3. Dario P. et al., Robotics for medical applications, IEEE Robotics and Automation Magazine, Vol. 3, 1996, pp. 44-56.
4. Wahrburg J., Kerschbaumer F., Using Robots to Increase the Accuracy of Surgical Interventions, Proceedings 24th Conference of the IEEE Industrial Electronics Society (IECON '98), 1998, pp. 2512-2516.
5. Kuenzler S., Knappe P., Gross I., Pieck S., Wahrburg J., Kerschbaumer F., Evaluation of a new navigated robot system for minimally invasive total hip alloarthroplasty, Presented at Surgetica, Grenoble, 2002.
6. Lueth T., Bier J., "Robot Assisted Intervention in Surgery", in: Gilsbach J.M. and Stiehl H.S. (Eds.), Neuronavigation and Computer Scientific Aspects, Springer Verlag, 1999.
7. Chassat F., Lavallee S., "An experimental protocol for accuracy evaluation of 6D localizers for computer assisted surgery: Application to four optical localizers", in: Proceedings of the First International Conference on Medical Image Computing and Computer-Assisted Intervention - MICCAI '98, Wells W.M., Colchester A., Delp S. (Eds.), Lecture Notes in Computer Science Vol. 1496, Springer Verlag.
8. Lueth T.C., Hein A., Albrecht J., Demirtas M., Zachow S., Heissler E., Klein M., Menneking H., Hommel G., Bier J.: A Surgical Robot System for Maxillofacial Surgery. IEEE Int. Conf. on Industrial Electronics, Control, and Instrumentation (IECON), Aachen, Germany, Sep. 1998, pp. 2470-2475.
9. Brandt G., Zimolong A., Carrat L., Merloz P., et al., "A compact robot for image guided orthopaedic surgery", IEEE Transactions on Information Technology in Biomedicine, Vol. 3, No. 4, 1999, pp. 252-260.
10. Pransky J., "Surgeons' realisations of RoboDoc", Industrial Robot, MCB University Press, Vol. 25, No. 2, 1998, pp. 105-108.
11. Taylor R.H., Lavallee S., Burdea G.C., and Moesges R. (Eds.), Computer Integrated Surgery, MIT Press, Boston, 1996.
12. Knappe P., Gross I., Pieck S., Kuenzler S., Kerschbaumer F., Wahrburg J.: Position control of a surgical robot by a navigation system; IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2003); Las Vegas, USA, 2003, pp. 3350-3354.
13. Kazanzides P.: Robot Assisted Surgery: The ROBODOC Experience, Proceedings 30th International Symposium on Robotics, Tokyo, Japan, 1999, pp. 281-286.
14. Grueneis C., Richter R., et al., 1999, Clinical introduction of the CASPAR system: problems and initial results. Proceedings of 4th Int. Symp. Computer Assisted Orthopaedic Surgery, CAOS '99, Davos, Switzerland, 17-19 March 1999.
Visualization and Augmented Reality
CALIBRATION OF A STEREO SEE-THROUGH HEAD-MOUNTED DISPLAY
SASSAN GHANAI, GEORG EGGERS, JOACHIM MUEHLING, RUEDIGER MARMULLA AND STEFAN HASSFELD
University Heidelberg, Oral and Maxillofacial Surgery, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
E-mail: [email protected]
TOBIAS SALB AND RUEDIGER DILLMANN
University Karlsruhe (TH), Institute for Computer Design and Fault Tolerance, Haid-und-Neu-Strasse 7, D-76131 Karlsruhe, Germany
E-mail: [email protected]

In this paper we present recent developments and pre-clinical validation results of our approach to calibrate a stereo see-through head-mounted display for augmented reality (AR) in craniofacial surgery. To achieve this goal, a standard NDI Polaris navigation system and commercial Sony Glasstron LDI-100D AR glasses are used. Our software, the INPRES system (intraoperative presentation of surgical planning and simulation results) [1], which was developed at the Department of Computer Science, Karlsruhe, controls the navigation system and displays the AR overlay in the glasses. First evaluations of the calibration process in the laboratory show promising results. A further study in the operating room (OR) revealed further opportunities for improvement.
1. Introduction

Craniofacial surgery is a highly complex medical discipline with various risks for the acting surgeon and the affected patient. The preoperative planning of an intervention demands a high degree of spatial sense from the surgeon. In the past, a 3D patient model had to be rendered from CT data, which could then be used for further preoperative planning. The goal of the INPRES system is to support spatial cognition by projecting the patient's data onto glasses worn by the surgeon. Furthermore, the displayed stereo image should overlap the real patient. To provide a realistic superposition
of the CT model with the patient, a precise registration and calibration is essential. In order to achieve this without using any additional hardware, different parameters have to be ascertained and entered into the system. In the past few years many AR systems have been developed for different medical applications. The University of North Carolina presented an AR system for assisting in laparoscopic surgery [2]. In this system, the video of the endoscope and the patient are matched and displayed in a see-through head-mounted display. Relevant patient data, like the position of a biopsy needle, can also be projected into the shown scene. A portable operation-microscope based system [4] has been developed at the University of Vienna. This "varioscope" benefits from the optical advantages of the microscope, like auto-focus and low weight. It has already been successfully evaluated with phantom skulls. Another approach for assisted biopsy based on see-through glasses has been realized by Siemens [3]. A different attempt is the system of the University of Karlsruhe, which uses a video beamer to project the AR scene onto the patient [5]. Fuhrmann et al. introduced the "fast calibration" method, which is one of the camera-less calibration approaches. It displays four virtual crosses for each eye, which have to be matched one by one with a tracked handheld reference cross. Once the size and orientation of both the real and the virtual cross are identical, a button situated at the handle of the handheld reference cross is pressed to mark the position of the reference cross [7]. A different camera-less calibration was presented by Genc [8].
2. Assembly
The experimental hardware setup of the INPRES system has been built up a t the Department of Computer Science, Karlsruhe. The clinical evaluation will be accomplished a t the Department of Oral and Maxillofacial Surgery, Heidelberg. The current environment consists of an aluminium rack with a 1,35m x 1,60 m table (see figure 1 left). For the superposition of the virtual data with a real patient we use a commercial stereoscopic optical see-through head-mounted-display of type Sony Glasstron LDI-D100BE (see figure 1 right. This display offers SVGA resolution(800x600 pixels) with 1.55 million dots per LCD. It weights 120 g and allows to modify the brightness, contrast, hue and color behavior. Data provided in (S)VGA mode or as video signal in (S)VHS format can be displayed. Through page flipping with 85 Hz it is possible to realize a stereo visualization. Further it is possible to adjust the distance between the display and the eyes and the
Figure 1. Left: INPRES scenario (from left: x-shaped model, calibration object, Sony Glasstron; in front: pointer). Right: Sony Glasstron.
angle relative to the viewing axis of the eyes. For navigation a commercial NDI Polaris tracking system with a P4 hybrid camera unit is used as a standard, together with active and passive trackers and an active pointer device. Data processing is performed on two PCs, one running Windows XP, the other SuSE Linux. The inter-process communication is provided through a CORBA interface. Besides the visualization of the virtual environment, INPRES handles the communication with the Polaris and the registration and calibration process.
3. Calibration
Through the INPRES system, the surgeon can visualize the patient's preoperatively acquired data during the operation. Risk areas marked during the planning phase are accentuated intraoperatively, which helps to minimize the risks of an operation. Essential for a good superposition are a good registration of the tracked objects and a good calibration of the AR system. Several parameters have to be determined for a good calibration. These are mainly:
- eye distance
- relative distance between the eyes and the displays of the glasses
- angle of the viewing axis relative to the displays
- orientation of the image plane
- aspect ratio
- (horizontal) field of view
Our model takes into account that the variability of positioning the glasses is preserved. After the registration is finished, the calibration is initiated. The calibration process can be divided into three parts:
- matching the displayed cross with the real reference cross
- evaluating the parameters
- optimizing the parameters
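Evaluating the parameters involves, among other things, estimating the nodal point of each eye as the point closest to several sighting axes (see Section 4). As a minimal illustration, not the INPRES implementation, the least-squares point nearest to a set of 3D lines, each given as a point and a unit direction, can be obtained from the normal equations sum(I - d d^T) p = sum((I - d d^T) a):

```python
def nearest_point_to_lines(lines):
    """Least-squares point closest to a set of 3D lines.

    lines: list of (a, d) pairs, where a is a point on the line
    and d is a unit direction vector.
    """
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for a, d in lines:
        for i in range(3):
            for j in range(3):
                # (I - d d^T) projects onto the plane orthogonal to the line.
                m = (1.0 if i == j else 0.0) - d[i] * d[j]
                A[i][j] += m
                b[i] += m * a[j]
    # Gauss-Jordan elimination with partial pivoting on [A | b].
    M = [A[i] + [b[i]] for i in range(3)]
    for c in range(3):
        piv = max(range(c, 3), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(3):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

With more than two (noisy) sighting lines, the same normal equations average out marking errors, which is the effect the least-squares optimization step exploits.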
4. Improvements and Methods
Figure 2. Left: Alignment of the virtual cross with the reference cross. Right: seeing-pyramid.
Applying the idea of Fuhrmann et al., our calibration process uses eight virtual crosses for each eye to collect the needed parameters. These are, for instance, the eye distance, the nodal points of the eyes, and the distance between the midpoint between the nodal points and the projection plane of the glasses. The nodal point is, simplified, the point where the picture of an object flips over. It can be calculated by intersecting different sighting axes, which arise between the real and virtual reference cross and the eye of the observer. For each eye, eight crosses (P1...P8) were marked based on the notch-and-bead sights principle. The reference cross is situated on an object which is tracked by the navigation system. This object may be mounted on a tripod to adjust its height to the eye level of the observer. In addition, the eight crosses were divided into two groups of four, which are placed onto two planes of a seeing-pyramid (see figure 2, right). The real cross is placed on a tracked reference object. To mark the position of equivalence of the real and virtual cross, a button of a pointer device has to be pressed. Through trigonometric conversion of the scatter plot, the needed parameters can be calculated. Since a human aims at a point only approximately, the evaluation of the nodal point is error-prone. This error can be reduced by using a least-squares method to match the scatter plot with the displayed point, which lies on the seeing-pyramid. In this way the effect of possible faulty points can be minimized or even compensated, depending mainly on the number of faulty points and their positions relative to each other.
4.1. Results and discussion
Figure 3. Left: Display of a trajectory on the x-shaped model. Right: Superposition of the os frontale.
The quantitative analysis of a calibration of optical see-through glasses is not a trivial task, since we do not have access to the image formed on the retina of the observer. A validation with the aid of a scale also appears impractical because of the problem of focusing two objects in different planes: either the displayed object becomes fuzzy and the scale sharp, or vice versa, which distorts the measured values. Figure 3 shows a projection of a trajectory and a superposition with the x-shaped model. In order to evaluate the system, engineers and physicians were asked to rate the complexity, the functionality and the quality of superposition. To do so, they were requested to pass through the whole registration and calibration process. The quality of superposition was assessed with the x-shaped model. It could be observed that people with experience in augmented reality accomplished a better superposition than others. The translatory deviation was about 1.94 mm for the first group, with the experienced participants. The other group showed a translatory error of 5.88 mm. The rotatory difference amounted to 2.02 mm for the second group, whereas the first group showed no perceptible rotatory error. In conclusion, experienced persons have a better sense of a good superposition. Another difference was found in the assessment of the system by the participants. The technically experienced rated the calibration process as more complicated than the medical staff did, whereas the medically experienced persons judged the superposition as not precise enough, due to a deviation in size between the real and the virtual object. Furthermore, some of them reported focusing problems after calibration. The first evaluation of the system revealed minor weak points, which will be addressed in the further development process. The handling of the system is one of the main points. Another point, the wearability of the head-mounted display, has been improved after the evaluation. The calibration process itself still shows some potential for improvements9. A second test stage with an improved system will be carried out shortly. Recapitulating: by replacing the handheld cross with the tracked reference cross, the handling of the calibration process became simpler, more comfortable and more accurate. Furthermore, the calibration object can be used for a first rating of the calibration right after the calibration process has ended. Also, the variability of the glasses has been preserved. From the medical point of view the system, once it works precisely, could become a good tool to support the surgeon's spatial vision.
The ability to look through the glasses and see the real environment gives the assurance that the system cannot hide parts of the scene. On the other hand, the calibration of this type of glasses is more complex and demands high accuracy from the user while matching the virtual with the real reference cross. Up to now, even with a good calibration, it is not possible to use our system in the operating room, because the displayed virtual object is usually slightly smaller than the real object.
5. Acknowledgement
This study was supported by the Deutsche Forschungsgemeinschaft within the collaborative research center 414 "Information Technology in Medical Science: Computer- and Sensor-Supported Surgery".
References
1. T. Salb et al.: INPRES (intraoperative presentation of surgical planning and simulation results) - augmented reality for craniofacial surgery. In J. Merritt et al., Eds., Proceedings: SPIE Electronic Imaging, International Conference on Stereoscopic Displays and Virtual Reality Systems, San Jose, CA, 2003. SPIE Press.
2. M. Rosenthal et al.: Augmented reality guidance for needle biopsies: A randomized, controlled trial in phantoms. In W. Niessen et al., Eds., Proceedings: International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pages 241-248, Utrecht, The Netherlands, 2001. Springer-Verlag.
3. F. Sauer et al.: A head-mounted display system for augmented reality image guidance: Towards clinical evaluation for iMRI-guided neurosurgery. In W. Niessen et al., Eds., Proceedings: International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), pages 707-716, Utrecht, The Netherlands, 2001. Springer-Verlag.
4. W. Birkfellner et al.: A head-mounted operating binocular for augmented reality visualization in medicine - design and initial evaluation. IEEE Transactions on Medical Imaging, 21(8):991-997, 2002.
5. H. Hoppe et al.: Intraoperative visualization of surgical planning data using video projectors. In J. Westwood et al., Eds., Proceedings: Medicine Meets Virtual Reality (MMVR), pages 206-208, Newport Beach, CA, 2001. IOS Press and Ohmsha.
6. A. Fuhrmann et al.: Fast calibration for augmented reality. 1999.
7. A. Fuhrmann et al.: Practical calibration procedures for augmented reality. Proceedings: Eurographics Workshop on Virtual Environments, 2000.
8. Y. Genc et al.: Practical solutions for calibration of optical see-through devices. Proceedings: IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR), pages 169-175, Darmstadt, 2002. IEEE CS Press.
9. T. Salb et al.: INPRES (intraoperative presentation of surgical planning and simulation results) - augmented reality for craniofacial surgery. In J. Merritt et al., Eds., Proceedings: SPIE Electronic Imaging, International Conference on Stereoscopic Displays and Virtual Reality Systems, San Jose, CA, 2003. SPIE Press.
COMBINED TRACKING SYSTEM FOR AUGMENTED REALITY ASSISTED TREATMENT DEVICE
WU RUOYUN
Clinical Research Unit, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
MD IRWAN MD KASSIM
Clinical Research Unit, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
WEE SIEW BOCK
General Surgery Department, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
NG WAN SING
Computer Integrated Medical Intervention Lab, School of Mechanical & Production Engineering, Nanyang Technological University, 50 Nanyang Ave, Singapore 639798
In this paper, we present a cost-effective tracking system aiming to provide sufficiently accurate positioning information for breast biopsy and surgery via Augmented Reality (AR) and Virtual Reality (VR) visualization. The tracking system makes use of both a stereo camera and encoders to simultaneously track two objects: the ultrasound probe and the surgical tool. The stereo camera, which is a must-have for the AR application, is used to track an optical marker mounted on the ultrasound probe. However, camera-based tracking has some inherent drawbacks, like inefficiency in tracking multiple objects, the line-of-sight requirement, and lack of robustness. To solve these problems, encoder-based tracking is used as a complement. Rotary encoders are installed in each joint of the articulated arms which hold the ultrasound probe and the surgical tool. These encoders give the angle of each joint; therefore, the positions of both the ultrasound probe and the surgical tool with respect to the base of the arm can be computed. A prototype of the system proved the concept, and future work is expected to improve the accuracy.
1. Introduction
Tracking the surgical tool and the patient is a common task in computer-assisted surgery. Nowadays, infrared-based tracking systems like the Optotrak from Northern Digital Inc, Canada, provide sub-millimeter accuracy in a real surgical context. However, such tracking systems are often expensive and have a line-of-sight (LOS) requirement. Electromagnetic tracking systems like the Aurora, also from
Northern Digital Inc, do not have the LOS requirement, but their accuracy is lower and they are sensitive to electromagnetic disturbance, e.g., the presence of metal. Our research project aims to use a minimally invasive breast biopsy device, such as the Mammotome from Johnson and Johnson, USA, to carry out lumpectomy, the most common surgical procedure for removing localized breast cancer. In a conventional lumpectomy, the surgeon removes the tumor plus a 1 cm margin of healthy tissue by open surgery. By using minimally invasive devices, the healing time and the risk of infection could be reduced. In order to turn a biopsy device into a treatment device, the tissue-cutting procedure must be well controlled. This can only be done with accurate cutting planning and needle positioning. Feedback from surgeons shows that 1-2 mm positioning accuracy should be sufficient, as the typical tumor size for lumpectomy is about 5-15 mm in diameter, and there is a 1 cm margin of healthy tissue to remove as well. Since the minimally invasive approach limits the surgeon's ability to perceive the malignant tissue directly, Augmented Reality (AR) and Virtual Reality (VR) are introduced to provide intuitive visual guidance for the surgeon. 3D ultrasound imaging is used for intra-operative guidance to locate the breast tumor. As the ultrasound image has a known position with respect to the ultrasound probe, tracking the tumor within the image is equivalent to tracking the ultrasound probe. A similar project has been done by other researchers using the Optotrak as tracking device [1]. The tracking configuration is shown in Figure 1: the Optotrak tracks the camera, the ultrasound probe, and the surgical tool simultaneously, so that all of them can be referenced in a single coordinate system.
Figure 1. Tracking configuration using the Optotrak. The Optotrak tracks v_camera, v_us and v_tool simultaneously.
We notice that vectors v_us and v_tool have higher accuracy requirements than v_camera. Vectors v_us and v_tool actually describe the relative position between the surgical tool and the patient, which is critical for a successful surgery and contributes to both AR and VR. On the other hand, vector v_camera only contributes to AR. For the AR visualization in our application, 1~2 mm accuracy in X/Y and 4~6 mm in Z should be sufficient, as when translated into pixels on the AR display device this amounts to a difference of only a few pixels. This motivates us to build a tracking system that is relatively cheap and yet delivers sufficient accuracy for our application. A prototype system has been built with both camera tracking and encoder tracking methods, to deliver both AR and VR visualization to the user.
2. Method
2.1. Tracking Method
Table 1 shows a brief comparison of the tracking methods we have evaluated. Camera tracking is a natural choice for us, since a stereo camera is a must-have for the AR visualization. However, camera tracking has the following disadvantages: relatively low accuracy, low speed because of the stereo video processing, and the LOS problem. To compensate, we use encoder tracking at the same time as a complementary measure.
Table 1. Comparison of tracking methods. The tracking target for the camera is placed at a distance of about 1 meter; the tracking target for the Optotrak at a distance of 2 meters. The accuracy and speed of encoder tracking may vary over a wide range, depending on the design and fabrication of the arm and the encoder adaptor used.
Category    | Camera Tracking           | Encoder Tracking | Optotrak
Accuracy    | 1~2 mm (X/Y), 2~4 mm (Z)  | 0.1~2 mm         | 0.1 mm (X/Y), 0.15 mm (Z)
Speed       | 4~10 Hz                   | 15 Hz~4 MHz      | 450 Hz
Range       | 0.5~2 m                   | Limited          | 2~3 m
Target      | Single                    | Multiple         | Multiple
LOS         | Yes                       | No               | Yes
Robustness  | Low                       | High             | Medium
Cost        | Low                       | Low              | High
Figure 2 shows the tracking configuration when using both camera tracking and encoder tracking. Based on a common main arm, two sub-arms link to the US probe and the surgical tool, and the encoders track vectors v_us and v_tool. The camera tracks vector v_AR. Vector v_camera can be computed from v_us and v_AR. As mentioned above, encoder tracking provides high accuracy in v_us and v_tool for the AR/VR visualization; the accuracy of v_camera is lower because of camera tracking, but it is sufficient for the AR visualization.
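Computing one pose from two others, as for v_camera above, amounts to composing rigid transforms. A minimal sketch with hypothetical frame names (T_base_marker: the probe marker in the arm-base frame, from the encoders; T_camera_marker: the same marker seen by the camera), using 4x4 homogeneous matrices:

```python
def mat_mul(A, B):
    # Product of two 4x4 homogeneous transforms.
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    # Inverse of a rigid transform [R t; 0 1] is [R^T, -R^T t; 0 1].
    R_t = [[T[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(R_t[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R_t[0] + [t[0]], R_t[1] + [t[1]], R_t[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def camera_in_base(T_base_marker, T_camera_marker):
    # Pose of the camera in the arm-base frame:
    # T_base_camera = T_base_marker * inv(T_camera_marker).
    return mat_mul(T_base_marker, rigid_inverse(T_camera_marker))
```

Once both the main arm and the camera are fixed, this transform needs to be computed only once and can then be cached.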
Figure 2. Tracking configuration using the combined tracking method. Encoders track v_us and v_tool, while the camera tracks v_AR. Vector v_camera can be computed from v_us and v_AR.
Figure 2 is a direct mapping from Figure 1. Camera, ultrasound probe, and surgical tool are referenced in a single coordinate system. Later in this paper we will discuss a simplified tracking configuration, which is an improvement over the one in Figure 2. 2.2. System Overview
The system tracks the ultrasound probe and the surgical tool, and conveys this position information to the surgeon via AR and VR visualization. The system also holds the ultrasound probe and the surgical tool, using two sub-arms called the ultrasound arm and the surgical arm, respectively. There are also motors to control the surgical tool. Figure 3 illustrates how the system would be used in the operating theater. Figure 4 shows the Alpha version of the system.
Figure 3. Conceptual drawing of the system in the operating theater.
Figure 4 Alpha version of the system in December, 2003. The VR visualization on the touch screen shows the posture of the arm system in real time.
2.3. Camera Tracking
A commercial stereo camera, the Digiclops from Point Grey Research, Canada, is used to track a marker template with a printed three-dot pattern, as shown in Figure 5.
Figure 5 Left: The marker for camera tracking is mounted on the holder of 3D ultrasound probe. Right: A sample augmented reality display; the arrow points to a virtual tumor within the box representing the 3D ultrasound image. They are supposed to be displayed under the ultrasound probe (not shown in the picture).
The Digiclops has a compact size, with 3 cameras arranged in an L shape. It is pre-calibrated and bundled with software libraries for video acquisition, image processing, stereo processing, and marker tracking. According to our experiments, the accuracy in tracking the three-dot marker shown in Figure 5 is about 1~2 mm (X/Y) at a distance of about 1 meter, and about 2~3 mm in the Z direction (the camera's view direction).
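The poorer Z accuracy is inherent to stereo triangulation: for a pinhole stereo pair, depth is Z = f*b/d (focal length f in pixels, baseline b, disparity d in pixels), so a disparity error propagates as dZ ~ Z^2/(f*b) * dd, growing quadratically with distance. A sketch with assumed, not Digiclops-specific, parameters:

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    # Pinhole stereo model: Z = f * b / d.
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, z_m, disparity_err_px):
    # First-order error propagation: dZ ~ Z^2 / (f * b) * dd.
    return z_m * z_m / (f_px * baseline_m) * disparity_err_px
```

For example, with an assumed f = 1000 px and b = 0.1 m, a 0.2 px disparity error at 1 m corresponds to about 2 mm of depth error, on the order of the Z accuracy reported above.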
2.4. Encoder Tracking
Encoder-based tracking has been widely used in commercial measurement arms. The latest models, like the FaroArm Platinum from FARO Technologies Inc, USA, can even reach 0.0005" accuracy. There are 4 encoders in the ultrasound arm and 7 encoders in the surgical arm. The encoders we are using provide a resolution of 2048 lines per revolution (0.044 degree/count), with a size compact enough to fit into the arm joints. Each encoder tracks one degree of freedom of the arm. As shown in Figure 6, with encoder tracking alone, the whole arm system can be visualized in the VR environment. VR has the advantage of an unlimited zoom factor, so that details of even sub-millimeter size can be clearly visualized.
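Converting encoder counts to joint angles and chaining the joints yields the tool pose by forward kinematics. A minimal sketch (a planar chain for brevity, with link lengths and angles of our choosing; the 0.044 degree/count quoted above corresponds to a 2048-line encoder read in 4x quadrature):

```python
import math

def counts_to_radians(counts, lines_per_rev=2048, quadrature=4):
    # A 2048-line encoder read in 4x quadrature gives 8192 counts/rev,
    # i.e. about 0.044 degrees per count.
    return 2.0 * math.pi * counts / (lines_per_rev * quadrature)

def planar_fk(link_lengths, joint_angles):
    # Tip position of a planar serial chain: accumulate joint angles
    # along the chain and sum the rotated link vectors.
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
    return x, y
```

Angular resolution alone is not the whole story: the positional error at the tip scales with the link lengths, which is why the long arm components discussed later degrade the tracking accuracy.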
Figure 6 Left: The VR visualization showing the whole arm system based on encoder tracking alone. Right: A close view of the 3D ultrasound volume with tumor surface model inside. The biopsy needle also interactively slices the ultrasound volume when the volume intersects with the virtual rectangle plane of the needle.
2.5. Protocol of Tracking Setup
Referring to Figure 2, our protocol of tracking setup is the following:
1. The encoders track vectors v_us and v_tool continuously for the whole procedure.
2. The user positions the main arm so that the two sub-arms can cover the region of interest (ROI). Then the main arm is fixed.
3. The user positions the camera so that the video covers the ROI sufficiently. Once v_AR is detected, v_camera can be computed. Then the camera is also fixed.
4. With both the main arm and the camera fixed, vector v_camera is fixed. Therefore, there is no need to continue camera tracking; the optical marker can be removed, leaving only encoder tracking.
The protocol above uses camera tracking only in the beginning. This avoids the camera-tracking problems, such as the LOS requirement and sensitivity to environment changes (notably illumination), for the rest of the procedure.
3. Discussion
3.1. Improving the Accuracy of Encoder Tracking
The Alpha version of our system proves the concept of the tracking configuration in Figure 2. However, the encoder tracking did not achieve our desired accuracy. We have identified the problem in the design and fabrication of the arm system. In order to track v_us and v_tool, there are 11 encoders in total between the ultrasound probe and the surgical tool. Many of the arm components are long, in order to provide enough range of movement for the ultrasound probe and the surgical tool. This leads to lower accuracy in tracking v_us and v_tool, which in turn gives lower accuracy in v_surgery (see Figure 7).
Figure 7. Simplified tracking configuration using the combined tracking method. Using an arm linkage between the surgical tool and the ultrasound probe, the encoders can track v_surgery directly.
Figure 7 shows a simplified tracking configuration. Vectors v_camera, v_us and v_tool will no longer be monitored. An arm linkage will be built from the ultrasound probe directly to the surgical tool. The linkage will be short, as the surgical tool is close to the ultrasound probe during the surgery. Furthermore, only 4-5 encoders are needed in the linkage, as there are some constraints on the positioning of the surgical tool with respect to the ultrasound probe. Since v_us and v_tool are not monitored, the simplified tracking configuration will not be able to visualize the whole arm system in the VR visualization. However, with v_AR and v_surgery, it is sufficient to display the tumor and the surgical tool in both VR and AR visualization. In any case, displaying the tumor and the surgical tool accurately is much more important than displaying the whole arm system with lower accuracy.
3.2. Head-mounted Display Device
A head-mounted display (HMD) is a common device used in AR applications [2]. In order to use an HMD device, the stereo camera has to be head-mounted as well. However, the Digiclops is not designed as a head-mounted camera. Another issue with using an HMD in Figure 2 and Figure 7 is that the camera has to perform continuous tracking, but our current camera tracking is not robust enough for that. A customized head-mounted stereo camera with robust tracking capability, such as those tracking infrared markers [3], may be built to support an HMD.
Acknowledgments This work is supported by National Medical Research Council of Singapore under research grant NMRC 063412002.
References
[1] Y. Sato, M. Nakamoto, Y. Tamaki, T. Sasama, I. Sakita, Y. Nakajima, M. Monden, S. Tamura, Image guidance of breast cancer surgery using 3-D ultrasound images and augmented reality visualization, IEEE Transactions on Medical Imaging, 17(5), 681-693 (1998).
[2] M. Figl, W. Birkfellner, J. Hummel, R. Hanel, P. Homolka, F. Watzinger, F. Wanshit, R. Ewers, H. Bergmann, Current status of the Varioscope AR, a head-mounted operating microscope for computer-aided surgery, Proceedings of the IEEE and ACM International Symposium on Augmented Reality, 20-29 (2001).
[3] F. Sauer, A. Khamene, B. Bascle, L. Schinunang, F. Wenzel, S. Vogt, Augmented reality visualization of ultrasound images: system description, calibration, and features, Proceedings of the IEEE and ACM International Symposium on Augmented Reality, 30-39 (2001).
VIRTUAL REALITY, AUGMENTED REALITY AND ROBOTICS IN SURGICAL PROCEDURES OF THE LIVER
Luc SOLER1, Nicholas AYACHE2, Stephane NICOLAU1,2, Xavier PENNEC2, Clement FOREST1,2, Herve DELINGETTE2, Didier MUTTER1, Jacques MARESCAUX1
1: IRCAD/EITS, 1 place de l'hôpital, F-67091 Strasbourg
2: EPIDAURE Project, INRIA, 2004 Route des Lucioles, F-06902 Sophia Antipolis
Medical image processing led to a major improvement of patient care. The 3D modeling of patients from their CT scan or MRI thus allows a better surgical planning. Simulation offers the opportunity to train the surgical gesture before carrying it out. And finally, augmented reality provides surgeons with a view in transparency of their patient which, in the future, will allow the automation of the most complex gestures. We present our latest results in these different fields of research applied to surgical procedures of the liver.
1. Introduction
During the 20th century, medicine witnessed the appearance of a new tool which revolutionized it: 3D medical MRI (Magnetic Resonance Imaging) or CT (Computed Tomography) imaging. Though acquisition techniques keep on evolving, reading and understanding these images often remain highly complex skills. Progress in computer technology made it possible to solve part of these reading difficulties by translating the information contained in the image into a 3D or 4D (i.e. 3D + time) image of the patient's anatomy and pathologies. Thanks to an improved preoperative knowledge of each patient's internal anatomy, practitioners can today establish an improved diagnosis and better plan the therapy best suited to a given case. Therefore, 3D modeling of a patient is generally used for diagnosis support or surgical planning tools. The other use is patient follow-up over time, easing the visualization of therapy efficiency. However, surgical simulation still remains limited to virtual models, without really exploiting the medical data of patients. Simulating an intervention on a virtual patient reconstructed from his/her medical image is nevertheless a major research direction, which would noticeably reduce medical mistakes thanks to an optimized preoperative training. Intraoperative use, which improves the surgical gesture, has to be added to the preoperative use of 3D patient modeling. In this domain, augmented reality offers a more efficient steering of the surgical gesture by superimposing preoperative information on the patient. Most applications have been developed in neurosurgery and in orthopedic surgery, since non-mobile bone landmarks are very reliable and allow an eased registration of the virtual
patient on the real patient. The few works that have been carried out on the abdominal region provide only limited accurate information, because of possible organ movements due to breathing. In order to overcome the limitations of 3D medical image analysis, preoperative simulation and augmented reality on organs and pathologies of the digestive system, we have been developing a set of tools over the last 8 years. Their objective is to provide support to surgeons of the digestive system, both before and during the surgical intervention. These tools are presented hereinafter.
2. 3D Modeling of Organs and Surgical Planning
The 3D reconstruction of patients from their CT scan or MRI medical imaging is one of the main research topics in the field of medical image processing. Most systems reconstruct anatomical and pathological structures with interactive tools; only a few propose automated systems allowing real use in clinical routine, where processing time has to be kept to a minimum. The digestive system is one of the most complex regions to analyze, because of the great number of neighboring soft organs that all have a very similar density. In the case of the liver, the main organ of the digestive system, radiologists use CT scans acquired 60 seconds after the intravenous injection of a contrast medium. These images allow the visualization of hepatic tumors, which are hypodense in the images; the contrasted vessels are bright, and the liver has an intermediate grey level, usually higher than that of the surrounding organs. Despite these visible variations, liver delineation remains a highly complex procedure, since its localization and shape can vary considerably (its average density can vary between 70 and 150 HU). Several authors have proposed to delineate the liver with automatic [1-3] or semi-automatic methods [4-5]. These methods give poor results for atypically shaped livers or when the liver contains big capsular hepatic tumors, only the healthy parenchyma being detected. We have developed a novel method [6] which detects parenchyma and tumors, based on techniques developed in the work of Bae et al. [3]. In order to obtain reliable and automatic results, we have added the prior segmentation of the neighboring organs. In this way, we obtain a more efficient and still automatic method which, within 15 minutes, provides from a CT scan with 2 mm slice thickness, taken 60 seconds after injection of a contrast medium: skin, bones, lungs, heart area, aorta, spleen, kidneys, gallbladder, liver, portal vein and hepatic tumors from 5 mm diameter upward [6].
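As a toy illustration of the idea (not the authors' algorithm), density windowing combined with masks of previously segmented neighbouring organs can isolate candidate liver voxels; the 70-150 HU window below echoes the density range quoted above:

```python
def liver_candidates(hu_slice, neighbour_mask, lo=70, hi=150):
    # hu_slice: 2D list of Hounsfield values for one CT slice.
    # neighbour_mask: True where a previously segmented neighbouring
    # organ already claims the voxel.
    # Keep voxels inside the liver density window and not yet claimed.
    return [[(lo <= v <= hi) and not m
             for v, m in zip(vrow, mrow)]
            for vrow, mrow in zip(hu_slice, neighbour_mask)]
```

Excluding the neighbours first is what removes the ambiguity caused by soft organs of very similar density.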
In order to use this reconstruction, we have also developed a surgical planning system [7] working on a standard multimedia computer. Indeed, the delineation of each organ allows us to deduce, for each of them, a compact triangular mesh and to display it on any 3D card compatible with the OpenGL standard. Besides the 3D visualization of the delineated and modeled structures, this system allows each structure to be rendered transparently, to be interacted with, to navigate anywhere, and thus to simulate any kind of celioscopy. It also allows virtual resections, defined by interactively positioned cutting planes, and provides the volume of all visualized structures (figure 1). Because of its compatibility with current standards, this system can be used on a laptop fitted with a 3D graphics card and can thus be used during the intervention so as to improve the control of the carried-out gesture.
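The volume of a delineated structure can be read directly off its closed triangular mesh by summing the signed volumes of the tetrahedra formed by each triangle with the origin (divergence theorem). A minimal sketch, with our own naming and assuming consistently oriented triangles:

```python
def mesh_volume(vertices, triangles):
    # vertices: list of (x, y, z) points; triangles: list of (i, j, k)
    # vertex indices describing a closed, consistently oriented mesh.
    # Each triangle contributes the signed volume of the tetrahedron
    # it forms with the origin: det[v_i; v_j; v_k] / 6.
    vol = 0.0
    for i, j, k in triangles:
        (x1, y1, z1) = vertices[i]
        (x2, y2, z2) = vertices[j]
        (x3, y3, z3) = vertices[k]
        vol += (x1 * (y2 * z3 - y3 * z2)
                - y1 * (x2 * z3 - x3 * z2)
                + z1 * (x2 * y3 - x3 * y2)) / 6.0
    return abs(vol)
```

The same sum works for any closed mesh regardless of where the origin lies, since contributions outside the solid cancel.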
Figure 1: Virtual resection of a 3D reconstructed liver of a patient from his/her medical images, and intra-operative clinical use of the planning software on a laptop.
3. Surgical Simulation
Thanks to minimally invasive surgery, surgical simulation witnessed a very important expansion over the last years. This surgical technique consists in operating on a patient with long instruments inserted into the patient's body, while looking at a screen linked to a camera visualizing the interior of the operated cavity. Such surgery thus offered a favourable field for the development of simulation, since it is possible to replace the camera image by a virtual image of the patient. Furthermore, since the surgeon operates with instruments, it is also possible to link these instruments to motorized systems reproducing the haptic sensation of force feedback. The first simulator of this type was the Minimally Invasive Surgery Trainer (MIST), developed in 1995 by the company Virtual Presence Ltd and today commercialized by Mentice (www.mentice.com). Since then, a great number of companies have taken up surgical simulation. Some products propose deformable models, usually surface-based ones. The most used deformation model is the spring-mass model, because of its easy implementation. If proposed, cutting
only concerns surface organs. The most simulated operation is the cholecystectomy, available in LapChole from Xitact (www.xitact.com), LapSim from Surgical Science (www.surgical-science.com), LapMentor from Simbionix (www.simbionix.co.il), or RLT from ReachIn (www.reachin.se). Some of these companies have developed hardware platforms that can be used for simulating abdominal surgery, gynaecologic or arthroscopic procedures. Various scenarios are then available as separate modules. The increasing realism of the visual rendering, due to the use of textures obtained from real images, and the progress in force-feedback mechanisms enabled those products to acquire some maturity. All these simulators propose training exercises for learning basic gestures. The pedagogy follows a step-by-step training logic, allowing in the end the realization of a more complex, graded gesture. Though they are attractive, these simulators are limited to the simulation of restricted, predetermined virtual models from a fixed database. Thus, they do not exploit the 3D reconstruction of patients that can be realized from the medical image, and do not allow the preparation of a surgical intervention on a copy of the patient. Furthermore, these simulators do not offer the opportunity to cut a voluminous organ like the liver and are in fact limited to learning basic gestures. From the 3D patient modeling, we propose to realize a realistic simulator allowing a surgical intervention to be prepared and simulated before actually carrying it out [8]. Thus, our objective is to realize and validate a highly realistic simulator for laparoscopic surgery of the liver, including realistic physical and visual modeling of organs, real-time force feedback, and the opportunity to change the patient topology thanks to broad resections realized on any region of the reconstructed patient. In order to realize such a simulator, it is important to reproduce the sensations linked to carrying out surgical gestures.
The first version described in [9] only allowed the simulation of the main prehension and resection gestures using an electric scalpel. In order to improve the visual quality of the simulator, we worked on the realistic appearance of the cut region by adding a local regeneration of the mesh, which provides a convincing simulation of ultrasonic resection (figure 2 and [10]). We maintain the manifold property of the mesh: this is the true novelty of this method [10]. This property is important since it allows us to compute the normal at each surface vertex of the mesh, and thus to improve the stability of the haptic interaction. Furthermore, it allows us to use a simplified data structure for the mesh, which is optimized for force computation. We also merged two different deformation algorithms in the same mesh: an explicit method called mass-tensor and a pre-computed method [9]. Thus, the virtual organ is separated into several volumetric regions in which the
deformation method (mass-tensor or pre-computed) is chosen according to the nature of the interaction with the virtual tool (no interaction, deformation, cutting). This method is based on domain decomposition theory and is optimized so as to minimize the amount of communication between two neighbouring regions. Our current research aims at parallelizing this method, which will speed up computation.
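The region-wise choice of deformation algorithm can be sketched as follows. The region identifiers and the interaction labels are hypothetical; the fragment only illustrates the dispatch logic, not the authors' implementation:

```python
# Illustrative sketch of the hybrid deformation dispatch described above.
# Region names and the interaction classification are hypothetical.

def choose_model(interaction):
    """Pick a deformation algorithm for a region according to how the
    virtual tool currently interacts with it."""
    if interaction in ("cutting", "deformation"):
        # Topology changes and direct contact need the explicit model.
        return "mass-tensor"
    # No interaction: the cheap pre-computed model suffices.
    return "pre-computed"

def assign_models(regions):
    """regions: dict region_id -> interaction ('cutting', 'deformation'
    or 'none').  Returns region_id -> chosen model."""
    return {rid: choose_model(inter) for rid, inter in regions.items()}

models = assign_models({"R1": "none", "R2": "deformation", "R3": "cutting"})
```

Only the boundary nodes shared by two regions need to exchange forces, which is what the domain-decomposition optimization mentioned above minimizes.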
Figure 2: Illustration of a resection showing the realistic aspect of cutting and vascular clamping.
4. Augmented Reality: The Transparent Patient
One limitation of preoperative planning and simulation is the difficulty of accurately reproducing the planned and simulated gesture on the real patient. This limitation can be overcome by superimposing preoperative data on the real patient during the intervention. However, this superimposition is complex to achieve in practice, since it requires an accurate correspondence of reference landmarks between the virtual and the real patient. We have developed a set of tools to obtain a reliable result that can be used in clinical routine. We propose to offer a view in transparency of the patient by superimposing the 3D virtual patient, reconstructed from MRI or CT medical images, on the video image acquired during the intervention. In order to remove the constraint linked to the deformation and movement of abdominal organs due to patient breathing, the medical image and the video image can be acquired under general anesthesia with a constant air volume inside the lungs. Such a procedure is observed in practice for needle insertion interventions, such as radiofrequency thermal ablation of hepatic tumors. Thanks to these restrictions, abdominal organs keep the same position between both acquisitions, with a movement observed in vivo of less than 1 mm. Registration can thus be limited to a 2D (video) - 3D (model) rigid registration of images. To perform an accurate registration it is essential to have reference landmarks that are visible in both images. In the case of neurosurgery or orthopedic surgery, fiducials (their number varies between 3 and 6) are usually fixed on bones in order to ensure their motionlessness. In the case of digestive
surgery, we place a greater number of fiducials on the skin (a total of 25), which ensures greater stability and thus a more reliable registration. These fiducials are automatically modeled during skin segmentation and automatically segmented in the video image. In order to carry out this registration, we orient two jointly calibrated tri-CCD digital color cameras on the model under two different points of view, with an angle between 40° and 90° for an accurate stereoscopic registration [11]. These cameras are connected to a personal computer through a Matrox Meteor II acquisition card allowing the simultaneous acquisition of two video sources. In order to superimpose the 3D model on the video images, fiducials located on the 3D reconstruction have to be matched to those visible in the video images. To do so, we use a 3D/2D registration according to our method described in [11]: the 3D points are the spatial coordinates of the fiducials reconstructed from the scanner data, and the 2D points are the pixel coordinates of the corresponding fiducials in both camera views. The experimental study [11] we carried out with an abdominal model showed that the superimposition accuracy for targets located inside the model reaches an average of 2 mm after fiducial registration. We thus developed an efficient registration method that superimposes the 3D model of the patient on the real-time acquisition performed in the OR, providing a virtual view in transparency of the patient. For this transparency visualization to be useful, it has to be coupled with a visualization of the tools that will be inserted inside the body, so as to make them visible during the whole intervention and thus allow precise targeting of the aimed tumor. Therefore, we developed a real-time tracking system for surgical tools that superimposes a virtual instrument on the real instrument (figure 3).
To do so, we have adapted the ARToolKit library [12] so that it works with the Meteor II acquisition card. Tracking of the instrument is thus eased by adding a printed planar square, which is automatically recognized by the library, and permanent visualization of the tool position becomes possible.
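The registration criterion described above amounts to minimizing, over rigid transforms, the pixel distance between the projected 3D fiducials and their segmented 2D counterparts. A minimal sketch of that reprojection error for one camera, assuming a pinhole model with illustrative intrinsics (not the calibration of the actual system):

```python
import math

# Sketch of the 3D/2D fiducial registration criterion: project the
# CT-derived 3D fiducials into a calibrated camera and measure the
# pixel distance to the segmented 2D fiducials.  Minimizing the mean
# reprojection error over rigid transforms yields the registration.
# Camera intrinsics and point values below are illustrative only.

def project(point, focal=800.0, cx=320.0, cy=240.0):
    """Pinhole projection of a 3D point in camera coordinates (z > 0)."""
    x, y, z = point
    return (focal * x / z + cx, focal * y / z + cy)

def reprojection_error(points3d, points2d):
    """Mean pixel distance between projected 3D fiducials and their
    2D counterparts in one camera view."""
    errs = [math.hypot(project(p3)[0] - p2[0], project(p3)[1] - p2[1])
            for p3, p2 in zip(points3d, points2d)]
    return sum(errs) / len(errs)

fiducials3d = [(0.0, 0.0, 1000.0), (50.0, 20.0, 1050.0)]
observed2d = [project(p) for p in fiducials3d]   # perfectly registered case
```

With two calibrated views, the same error is summed over both cameras, which is what makes the stereoscopic setup resolve depth.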
Figure 3: Computer-assisted targeting thanks to the augmented reality view.
In order to validate the interest of such a system, we performed a targeting task similar to the one carried out during a radiofrequency ablation intervention. We modeled targets with radio-opaque fiducials stuck on the synthetic liver inside a model (cf. figure 3). To reach the targets, the augmented view provides the virtual position of target and needle on both video images. Needle orientation is guided by the color of the target, which changes when the needle points in the right direction. Furthermore, the software indicates in real time the distance (in mm) between the virtual needle tip and the virtual target. Two different users, the software designer and an expert surgeon who did not participate in the software development, then carried out 50 targetings each. Targeting accuracy was evaluated with an endoscopic camera oriented towards the target fiducials. The resulting mean distance and time to reach the target are indicated in Table 1.

Table 1: Targeting results (mean value ± standard deviation).
                     Real mean distance (mm)   Mean time (sec)
Computer Scientist   3.60 ± 1.03               38.50 ± 21.78
Surgeon              1.98 ± 1.27               54.64 ± 24.88
Total                2.79 ± 1.41               46.57 ± 24.64
These results clearly show a high targeting accuracy, since the expert has a targeting error of less than 2 mm. Furthermore, the comparison between targetings carried out by the software designer and by the expert surgeon allows us to check that the system ensures a very high realism in use, favoring expertise in the gesture rather than expertise in the developed system. This is indeed a frequently encountered problem in computer-assisted surgical gesture
systems, where the designer's knowledge of the system often enables him or her to carry out more accurate actions than the medical expert. The other advantage of this system is the time required for needle positioning. In clinical routine, such a positioning takes 5 to 10 minutes, due to the use of intraoperative imaging, such as ultrasonography or CT, which prolongs the intervention. Our experience shows that the gesture requires on average less than one minute to be carried out.
5. Conclusion
The various works carried out by our research teams have led us to develop a set of tools for diagnosis support and surgical intervention planning. They also allow a complex procedure to be simulated on a 3D virtual copy of the patient, and preoperative information to be used during the intervention, in particular by superimposing the virtual image of internal organs and pathologies on the abdomen of the patient. These systems, currently at an experimental stage, are progressively being tested clinically, with the objective of eventually being used in clinical routine. They represent the first essential phase towards surgical gesture automation, which will help to reduce surgical mistakes. Indeed, intervention simulation makes it possible to eliminate all superfluous or imperfect gestures, using the simulation as a programming of the final gesture. This gesture will then be transmitted to a surgical robot which, thanks to augmented reality, will be able to precisely reproduce the optimized gestures of the surgeon. Tomorrow's surgery is on its way.
Acknowledgments We would like to thank all our medical partners, who have provided us with medical images of patients, and in particular Pr. Afshin Gangi and Pr. Catherine Roy from the radiological department of the Civil Hospital in Strasbourg. We also thank the surgical team of Professor Marescaux for their help, medical analysis and validation of our works, especially Pr. Mutter, Pr. Leroy, Dr. Vix, Dr. Henry and Dr. Garcia. Finally, we are very grateful to the Virtual-Surg team of IRCAD and the Epidaure team of INRIA for their great contribution to these works, in particular S. Cotin, F. Doganis, C. Haessig, C. Koehl, J. Moreau, A.-B. Osswald, N. Papier, and G. Picinbono.
References
1. J. Montagnat and H. Delingette: "Volumetric Medical Images Segmentation using Shape Constrained Deformable Models". CVRMed/MRCAS'97, Springer Verlag, LNCS 1205, pp. 13-22, (1997).
2. L. Gao, D.G. Heath, B.S. Kuszyk and E.K. Fishman: "Automatic Liver Segmentation Techniques for Three-Dimensional Visualization of CT Data". Radiology, 2(201), pp. 359-364, (1996).
3. K.T. Bae, M.L. Giger, C.-T. Chen and C.E. Kahn: "Automatic segmentation of liver structure in CT images". Medical Physics, 1(20), pp. 71-78, (1993).
4. A. Schenk, G. Prause and H.-O. Peitgen: "Efficient Semiautomatic Segmentation of 3D objects in Medical Images". MICCAI 2000, Springer Verlag, LNCS 1935, pp. 186-195, (2000).
5. G. Glombitza, W. Lamade, A. M. Demeris, M.-R. Gopfert, A. Mayer, M. L. Bahner, H.-P. Meinzer, G. Richter, T. Lehnert and C. Herfarth: "Virtual planning of liver resections: image processing, visualisation and volumetric evaluation". International Journal of Medical Informatics, Elsevier, vol. 53, pp. 225-237, (1999).
6. L. Soler, H. Delingette, G. Malandain, J. Montagnat, N. Ayache, C. Koehl, O. Dourthe, B. Malassagne, M. Smith, D. Mutter, J. Marescaux: "Fully automatic anatomical, pathological, and functional segmentation from CT scans for hepatic surgery". Computer Aided Surgery, Vol. 6, Num. 3, pp. 131-142, (2001).
7. C. Koehl, L. Soler, J. Marescaux: "A PACS Based Interface for 3D Anatomical Structures Visualization and Surgical Planning". SPIE Proceedings Vol. 4681, pp. 17-24, San Diego, USA, (2002).
8. N. Ayache: "Epidaure: a Research Project in Medical Image Analysis, Simulation and Robotics at INRIA". IEEE Transactions on Medical Imaging, 22(10), pp. 1185-1201, (2003).
9. S. Cotin, H. Delingette, and N. Ayache: "A Hybrid Elastic Model allowing Real-Time Cutting, Deformations and Force-Feedback for Surgery Training and Simulation". The Visual Computer, 16(8), pp. 437-452, (2000).
10. C. Forest, H. Delingette, N. Ayache: "Cutting Simulation of Manifold Volumetric Meshes". Medical Image Computing and Computer-Assisted Intervention (MICCAI '02), LNCS 2488, pp. 235-284, (2002).
11. S. Nicolau, X. Pennec, L. Soler and N. Ayache: "Evaluation of a new 3D/2D registration criterion for Liver Radio-frequencies Guided by Augmented Reality". IS4TM 2003, LNCS 2673, pp. 270-283, (2003).
12. M. Billinghurst, H. Kato, I. Poupyrev: "The MagicBook: Moving Seamlessly between Reality and Virtuality". IEEE Computer Graphics and Applications, May/June 2001, pp. 2-4, (2001).
FUNCTIONAL MAPPING OF THE CORTEX - SURFACE BASED VISUALIZATION OF FUNCTIONAL MRI

RENE KRISHNAN, ANDREAS RAABE, MICHAEL ZIMMERMANN AND VOLKER SEIFERT
Department of Neurosurgery, Johann Wolfgang Goethe-University, Schleusenweg 2-16, 60528 Frankfurt/Main, Germany
The interindividual variability of the complexity of cortical geography often impedes studies of functional specialization. The display of functional data in stereotactic space may lead to underestimation of the distance between activated areas, as two points close to each other in stereotactic space can be distant along the cortical surface. In this study we discuss compensation strategies involving explicit reconstruction and display of cortical surfaces (such as the 3D native configuration, 2D slices, extensively smoothed surfaces, spherical representations and cortical flat maps). Functional MRI studies of 5 patients during standardized paradigms for hand, foot and tongue movements were performed and EPI T2*-weighted images were acquired. Data analysis was done with the Brain Voyager software (Brain Innovation, Maastricht, NL) on a standard PC. Statistical maps for the time course of activated voxels and corresponding 3D surfaces were rendered. The functional motor areas were clearly visible in all reconstructions. In native 3D reconstructions activation within the sulci was not always fully visible, whereas these areas can clearly be seen on the morphed surfaces. Inflated models and flat maps were superior in demonstrating the spatial extent of an activated area. For interindividual comparison, mapping the data onto well defined geometrical objects is an alternative solution. Surface based warping allows functional data to be displayed while respecting surface topology, in contrast to slice representations. Surface based morphing of functional and anatomical data is a new perspective in the visualization and 3D reconstruction of functional MRI data.
1. Introduction
The cerebral cortex, which has the topology of a 2-D sheet with a complex and highly folded geometry, is composed of a mosaic of a large number of different specialized areas. Some of these areas can be visualized in a task-specific manner by means of functional MRI neuroimaging studies using the BOLD contrast. Common approaches for analyzing functional data include the display of activations on orthogonal slices of a 3-D volume or on volume renderings. This technique has major drawbacks: the sulcal regions are occluded from view, image dimensions are distorted when the surface is oblique to the viewing angle, and the topology of the cortex is not respected by the stereotactic coordinate system - points that are close together in 3-D space may be widely separated along the cortical surface.
To address these problems, surfaces can be modified in several ways, such as smoothing the surface (inflation), cutting it at suitable locations (flat maps) and mapping it to geometrically well defined shapes, to establish a parameterizable 2-D coordinate system, which allows for intersubject comparisons. In this study we have applied these visualisation strategies to the functional data sets of five patients, selected for surgery, with lesions close to the central region. The surface display formats were evaluated with respect to visibility, topology, distortion, parameterisation and localization, relative to the native configuration.

2. Patients and Methods
2.1. Functional MRI - Data Acquisition
The data sets of five patients (3 female, 2 male; age 24-58 years) with lesions close to the central region were transformed and examined. Functional magnetic resonance imaging was performed with a 1.5 Tesla Siemens Magnetom Vision scanner (Siemens AG, Erlangen, Germany) using the echo-planar T2* imaging technique (EPI; 15 slices, slice thickness 6 mm, slice distance 0.6 mm, TE 66 ms, TR 4 sec, matrix 128x128, FOV 230 mm, 120 measurements, 3 dummy measurements). The paradigm used consisted of 8 periods of alternating tongue movements along the teeth as well as movements of the hand (finger tapping) and foot (toe flexing) contra-lateral to the affected side. No resting condition was implemented. During each examination a T1-weighted 3D MP-RAGE data set was acquired as anatomical reference scan. The voxel size of these images was isotropic with 1x1x1 mm.

2.2. High resolution anatomical data
A 3-D magnetization-prepared rapid acquisition gradient echo sequence (MP-RAGE) was acquired after gadolinium administration (TR 9.7 ms; TE 4 ms; flip angle 12 deg.; slab 170 mm; 170 slices; slice thickness 1 mm; FOV 256 mm, matrix 256 x 256) with isometric voxel sizes of 1x1x1 mm, and co-registered with the fMRI images.

2.3. Functional MRI Data Analysis

We used the Brainvoyager 2000 v 4.9 software package (Brain Innovation, Maastricht, NL), which provides fast and optimized 2D and 3D analysis and visualization routines, as well as advanced volume-based statistical analysis methods. Volume and surface rendering are integrated software features,
including methods for automatic brain segmentation, surface reconstruction, cortex inflation and flattening. The program runs on a Pentium III PC with 2.4 GHz, 1 GB RAM and an 80 GB hard disk under Windows 2000.
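In its simplest form, the statistical mapping of activated voxels amounts to correlating each voxel's time course with a condition on/off regressor. Brain Voyager applies a full statistical model, so the following sketch with synthetic data only illustrates the principle:

```python
import math

# Principle behind activation maps: correlate a voxel time course with
# a condition regressor.  The data below are synthetic and the boxcar
# is a simplification of the actual alternating-movement paradigm.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

boxcar = [0, 0, 1, 1, 0, 0, 1, 1]           # condition off/on blocks
active = [10, 11, 19, 20, 10, 9, 21, 20]    # signal follows the condition
inactive = [9, 11, 11, 9, 9, 11, 11, 9]     # fluctuation unrelated to it
```

Thresholding such a statistic per voxel yields the statistical maps that are then rendered on the reconstructed surfaces.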
3. Results
3.1. Native 3-D Reconstruction

The automatic segmentation module executes several steps to segment the cortex along the white matter and grey matter boundary, involving edge-preserving smoothing, filling of the ventricles, creation and analysis of intensity histograms to detect grey and white matter peaks, region growing and other morphological operations. Finally the hemispheres are disconnected and "bridges" between the sulci are removed. After this segmentation a 3-D model of the surface geography was obtained for all patients. The intrinsic curvatures were shaded and smoothed - darker colour represents cortex that is buried within the sulci. The functional data are represented in a natural and familiar way; the lesion may be segmented and included in the reconstruction.
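The histogram-analysis step mentioned above can be illustrated with a minimal peak detector on a synthetic intensity histogram; the real segmentation module uses considerably more robust heuristics:

```python
# Toy version of the histogram analysis used to locate grey- and
# white-matter intensity peaks.  The histogram below is synthetic.

def local_maxima(hist):
    """Indices of bins whose count exceeds both neighbours."""
    return [i for i in range(1, len(hist) - 1)
            if hist[i] > hist[i - 1] and hist[i] > hist[i + 1]]

hist = [1, 3, 9, 4, 2, 6, 12, 5, 1]   # two modes: grey and white matter
peaks = local_maxima(hist)            # bin indices of the two modes
```

The detected peak intensities then seed the thresholds for region growing along the grey/white boundary.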
3.2. Extensively Smoothed Surfaces (Cortex Inflation)

Cortex inflation uses two forces, corrective smoothing and the length distortion force, which minimizes geometric distortions during inflation and unfolding. To apply these forces, a surface mesh of the folded cortex serves as a reference for all future morphed versions. To execute cortex inflation a smoothing force is applied several hundred times, resulting in a representation that retains much of the shape and metric properties of the original surface, but allows the visualization of functional activity occurring within the sulci.

3.3. Mapping the Surface to Parameterizable Shapes
Identifying corresponding points on different cortical surfaces requires the establishment of a uniform surface based coordinate system (in contrast to volume based coordinate systems, in which a point on the surface of one volume will typically not lie on the surface of a different volume). Mapping the cortex to a geometrically well defined surface offers the advantage of representing localisations by a two-dimensional coordinate system that respects the topology of the surface. Just as latitude and longitude describe locations on the earth, surface based coordinates have advantages when describing locations on the cerebral cortex.
This approach offers significant advantages for interindividual comparison of functional data by morphing one parameterizable surface to another (surface based morphing).

3.4. Flattening Surfaces
To flatten a hemisphere with minimal distortions, a number of cuts on the medial surface have to be performed - one around the corpus callosum to remove all midbrain structures and a set of equally spaced radial cuts. This preserves the topological structure of the lateral aspect of the surface. Flat maps are a very compact representation, as a hemisphere can be seen in a single view without foreshortening, since it is planar. The inherent distortion can be partly corrected. A drawback is the interruption of surface topology: two points close together on the surface may now be separated.

4. Discussion
New visualization and analysis methods for fMRI have been developed that emphasize surface representations. Surface based visualization includes methods for reconstructing surfaces and changing their shape to facilitate the analysis of geometrical and topological relationships. Basically, 3-D and 2-D coordinate based methods can be discerned (native reconstruction and inflated surfaces vs. spherical and flat maps). Native reconstructions have the advantage of familiarity and are a prerequisite for surgical planning and integration in neuronavigation systems. Accurate measurements in 3-D (stereotactic) space can be done in this representation. Buried cortex is brought into full view in the other three formats: extremely smoothed surfaces, spherical/ellipsoidal maps and flat maps. The former two preserve topological relationships, but they require multiple views to see the entire hemisphere. Flat maps are the most compact representation, but entail selected cuts to prevent severe distortions. They are especially useful for displaying complex functional data sets, e.g. chronology in event-related fMRI. The spherical deformed surface, as a 2-D coordinate based representation of a hemisphere, is the fundament of surface based warping, which is designed to bring one hemisphere into better register with another, to analyse the consistency as well as the variability of cortical function and its relation to cortical geography.

4.1. Minimizing Distortions
A mapping between surfaces without metric distortion is called an isometry. Finding such a mapping from a sphere to a plane has been called the "mapmaker's problem" and was shown to be impossible by Gauss in 1828, as
these surfaces have different intrinsic (or Gaussian) curvatures. All surface representations that involve shape changes introduce distortions in surface areas and linear dimensions. To use the improved distortion correction algorithm, a completely unfolded/inflated representation of the cortical sheet has to be loaded. In addition, a folded reference mesh (the native 3-D reconstruction) must be linked, which provides the desired correct local geometry. Although distortions are unavoidable, they are not a major methodological drawback in general.
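Gauss's result can be illustrated numerically: a geodesic disc on the unit sphere and a flat disc of the same radius differ in area, and the mismatch grows with patch size. A small self-contained sketch, not part of the described software:

```python
import math

# Numerical illustration of the mapmaker's problem: a geodesic disc of
# angular radius theta on the unit sphere has area 2*pi*(1 - cos(theta)),
# while a flat disc whose radius equals the geodesic radius has area
# pi*theta**2.  The ratio of the two shows the unavoidable area
# distortion, vanishing only as theta -> 0 (locally flat).

def area_distortion(theta):
    spherical = 2.0 * math.pi * (1.0 - math.cos(theta))
    flat = math.pi * theta ** 2
    return flat / spherical

tiny = area_distortion(0.01)            # nearly isometric for small patches
hemisphere = area_distortion(math.pi / 2)  # >20% area inflation
```

This is why flattening a whole hemisphere requires the cuts described above: each cut piece stays small enough to keep the distortion correctable.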
3D-RECONSTRUCTION AND VISUALIZATION OF BONE MINERAL DENSITY FOR THE ETHMOID BONE

CORNELIA KOBER
Fac. of Engineering and Computer Science, Univ. of Appl. Sc. Osnabrück, Albrechtstr. 30, P.O. Box 1940, D-49009 Osnabrück, Germany
ROBERT SADER
Clinic for Reconstructive Surgery, Univ. Hospital Basel, Spitalstr. 21, CH-4031 Basle, Switzerland
HANS-FLORIAN ZEILHOFER
Clinic for Reconstructive Surgery, Univ. Hospital Basel, Spitalstr. 21, CH-4031 Basle, Switzerland

Second affiliation of all authors: Center of Advanced Studies in Cranio-Maxillo-Facial Surgery, Klinikum rechts der Isar, TU Munich, Ismaningerstr. 22, D-81675 Munich, Germany
The concern of this article is a detailed 3D analysis of the ethmoid bone based on CT data stemming from clinical routine. By special preprocessing, a semiautomatic segmentation of the fine structures was possible. The subsequent surface reconstruction delivered a three-dimensional representation of the anatomical details including the labyrinths, the nasal conchae, and the nasal septum. By a combination of computer graphics modules, a qualitative volumetric visualization of the bone mineral density could be given. A detailed analysis revealed the refined inner structure of the nasal septum.
1. Introduction
Three-dimensional surface reconstructions of the individual geometry of the patient are increasingly valued for the purpose of surgery or treatment planning. Today, a satisfactory reconstruction of bony organs based on computer tomography (CT) data is possible by means of standard software such as [1, 2, 8]. But a reliable analysis of very fine and at the same time quite inhomogeneous structures is still an active topic of research. In this context, our concern is the ethmoid bone, see Figure 1. For the anatomical background, we refer to [4, 7, 9]. The ethmoid bone is situated at the anterior part of the base of the cranium, between the two orbits, at the roof of the nose, and contributes to each of these cavities. It is exceedingly light and spongy, and cubical in shape. It consists of four parts: a horizontal plate, forming part of
the base of the cranium; a perpendicular plate, constituting part of the nasal septum; and two lateral masses or labyrinths. Each labyrinth or lateral mass (labyrinthus ethmoidalis) consists of a number of thin-walled cellular cavities, the ethmoidal cells, arranged in three groups, anterior, middle, and posterior, and interposed between two vertical plates of bone; the lateral plate forms part of the orbit, the medial one part of the corresponding nasal cavity.
Figure 1. Sagittal cut through the ethmoid bone, lateral wall of the nasal cavity, see [4].
2. Methods
The input of our analysis was a CT data set of a 45-year-old female without diagnostic findings in the ethmoid bone but with overall reduced bone quality. The maxilla was toothless with three implants. The scan was performed with an in-plane pixel size of 0.4 mm and an axial slice thickness of 2 mm. We deliberately chose this data set of reduced quality stemming from clinical routine. Though consisting of bone and cartilage tissue, the Hounsfield values of the walls of the ethmoid cells were even below those of soft tissue, see Figure 2a, b. Therefore, we manipulated the data by a combination of a 3D Lanczos filter, Gaussian smoothing, and the original data, see [6]. By this, thin bony structures were returned as dark lines of nearly homogeneous grey value, see Figure 3. Now the anatomical details could be captured mainly by standard segmentation tools, i.e. threshold segmentation, region growing, or active contours. In this way, we performed a segmentation of the scanned part of the skull, the implants, the
ethmoid, the sphenoidal, and the maxillary sinuses, the incisive canal, and the lesser and greater palatine foramina. Besides the ethmoid cells, this kind of preprocessing has been successfully applied to the floor of the orbit, which is often hardly visible in standard CT data.
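The effect of blending smoothed data back into the original, which underlies the preprocessing described above, can be illustrated in 1-D. The kernel and weights below are purely illustrative and are not the 3D Lanczos/Gaussian combination of [6]:

```python
# 1-D illustration of blending a smoothed version with the original
# data so that a thin, interrupted structure becomes more homogeneous.
# Kernel and blending weight are illustrative, not those of [6].

def gaussian_smooth(signal, kernel=(0.25, 0.5, 0.25)):
    """Convolve with a small kernel, clamping indices at the borders."""
    n, half = len(signal), len(kernel) // 2
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - half, 0), n - 1)
            acc += w * signal[j]
        out.append(acc)
    return out

def blend(original, weight=0.5):
    """Mix the original signal with its smoothed version."""
    smoothed = gaussian_smooth(original)
    return [weight * o + (1 - weight) * s
            for o, s in zip(original, smoothed)]

noisy = [0, 0, 10, 0, 0, 10, 0, 0]   # thin, interrupted "wall"
result = blend(noisy)                # gaps filled, peaks softened
```

The gaps between the isolated peaks receive nonzero values while the peaks themselves are attenuated, which mimics how the thin cell walls become lines of nearly homogeneous grey value.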
Figure 2. Axial CT slice (a) with spline probe and additional plot of the Hounsfield values along the spline (b). The plot starts at the right hand side of the spline.
Figure 3. Axial CT slice of the ethmoid cells after preprocessing.
After the segmentation, the next step was the three-dimensional surface reconstruction of the whole model. Again, the complicated geometry and the low quality of the input data aggravated the procedure. In order to preserve the thin structures of less than 1 mm, two intermediate slices were interpolated between the segmented slices along the z-direction. By this, the dimensions of the single voxels became as homogeneous as possible. After this, we applied moderate Gaussian smoothing to the labelfield. Then we applied the algorithm for the generation of non-manifold surfaces provided by Amira [10, 11]. The resulting model of about 2 million faces was very uneven and "stepwise", but contained all wanted details. The next steps were successive coarsening and smoothing to a surface model of about 100,000 faces. A simplification algorithm from computer graphics has been provided within Amira for this purpose, specially avoiding intersections and assuring a high quality of the surface triangles [3]. The shape of the original surface is preserved by minimizing a certain error criterion. Additionally, by a combination of computer graphics modules, a three-dimensional volumetric visualization of bone mineral density could be given. By semiautomatic segmentation, significant ranges of voxels were grouped together. After this, every voxel group was assigned a colour. Usually we follow the physical colour scale, but we are not restricted to it. A reasonable alternative is a stepwise grey scale. Next, we applied again the surface reconstruction algorithm provided by Amira [10, 11]. Transparent rendering of the generated polygonal data set, controllable via a transparency factor, delivered a qualitative profile of the inner consistency. In order not to exceed hardware and software capacities, we restricted our analysis to a cubic region containing the ethmoid bone. In this article, we chose a four-step grey scale. According to the CT slices, the colour of the voxel group with the highest grey values was white. Then two intermediate grey levels follow. Finally, black indicates the voxel group with the lowest grey values. For more details, see [5]. The software requirements for an analysis of organs such as the human skull are robust handling of large data sets, reliable surface reconstruction and brilliant visualization possibilities. For all programming and image processing steps, we used the visualization package Amira, versions 3.0 and 3.1, see [1]. Concerning the hardware, main memory and the graphics board are decisive.
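The grouping of voxels into the four grey-value classes can be sketched as a simple binning by Hounsfield value; the thresholds below are hypothetical, chosen only for illustration:

```python
# Sketch of the voxel grouping behind the four-step grey scale: each
# voxel is assigned to one of four groups by its Hounsfield value, and
# each group is later rendered with its own colour and transparency.
# The threshold values are hypothetical.

THRESHOLDS = (-200, 150, 700)   # illustrative HU cut points

def group(hounsfield):
    """Return group 0 (black) .. 3 (white) for a single voxel value."""
    g = 0
    for t in THRESHOLDS:
        if hounsfield > t:
            g += 1
    return g

groups = [group(hu) for hu in (-500, 0, 300, 1200)]
```

Each resulting group is then fed to the surface reconstruction separately, so that transparent rendering of the nested surfaces reveals the inner density profile.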
Figure 4. 3D model, 1: maxillary sinuses, 2: paranasal sinuses, 3: inferior conchae, 4: medial conchae, 5: ethmoid cells.
3. Results
We present a three-dimensional reconstruction of the scanned part of the skull, including the ethmoid bone, especially the labyrinths, the nasal conchae, and the nasal septum. The skull, the implants, the ethmoid, the sphenoidal, and the maxillary (together with the paranasal) sinuses, the incisive canal, and the lesser and greater palatine foramina were independent geometrical entities, see Figure 4. For a comparison of an (independent) anatomical preparation and our surface reconstruction, see Figure 5a, b.
Figure 5. View into the ethmoid cells and the nasal cavity, a: preparation, see [7], b: surface reconstruction. 1: sphenoid sinus, 2: ethmoid cells, 3: nasolacrimal duct, 4: nasal cavity, 5: nasal septum.
Through the macroscopic profile based on the voxel grouping, the rather differentiated inner structure of this part of the skull came to the fore. The resulting model comprised 460,725 points and 1,059,384 faces. Within the 3D viewer of Amira 3.1, it could still be handled interactively. Because we dealt with CT data, we interpret the grey values as related to (optical) tissue density. According to the four voxel groups, we can assign four kinds of tissue within this part of the skull. First, the inner walls of the labyrinths clearly belong to the group with the lowest grey values. The lateral walls of the ethmoid bone, the so-called laminae papyraceae, the conchae, and the nasal septum belong mainly to the voxel groups of intermediate grey values, whereas the posterior surface of the ethmoid bone and the innermost part of the nasal septum are rendered in white. Nevertheless, the interpretability of the grey scale illustration is not comparable with a coloured representation based on the physical colour scale, for instance.
Figure 6. View from above of the macroscopic profile of the inner structure embedded into a transparent rendering of the whole model. The grey levels are aligned to the Hounsfield values of the CT data, ascending from black over two intermediate grey levels to white.
As an example of a detail analysis, we add an elaboration of the inner structure of the nasal septum, see Figure 7. Because of its perpendicular shape, it could be isolated by two sagittal cutting planes on its left and right hand sides, see the small model in Figure 7. In spite of its width of only a few pixels, it was possible to recognize some structure. Roughly speaking, there are two main regions of reduced density, located in the upper and the lower part of the nasal septum. Furthermore, the density profile seems to be adapted to the special shape of the nasal septum. The oval patterns may suggest an influence of the nasal air flow.
Figure 7. Detail reconstruction of the nasal septum, the small model below indicates the position. The grey levels are aligned to the Hounsfield values of the CT data, ascending from black over two intermediate grey levels to white.
4. Discussion and outlook
The presented surface reconstructions and visualizations give suggestive impressions of the geometry and the inner structure of the respective part of the organ under consideration, for instance the nasal septum or the conchae. The approach turned out to be applicable to data sets stemming from clinical routine. Due to the preparatory image processing of the input data, the segmentation effort was acceptable. Therefore, the next step will be the application to pathological cases. Furthermore, we will try to contribute to the validation of the reconstruction of various thin structures. A special concern of the group is the analysis of tumours in this area and the minimally invasive access to the skull base via the ethmoid bone.
Acknowledgment
The first author would like to thank H.-C. Hege and his team at ZIB Berlin for providing her with an Amira license. All reported visualizations were performed with Amira [1].
References
1. Amira - Advanced Visualization and Volume Modeling, User's Guide and Reference Manual, www.amiravis.de.
2. AnalyzeDirect, Comprehensive Visualization for Biomedical Imaging, www.analyzedirect.com.
3. M. Garland and P. S. Heckbert, SIGGRAPH '97, 209 (1997).
4. H. Gray, Anatomy of the Human Body, www.bartleby.com/107/ (1918).
5. C. Kober, R. Sader, and H.-F. Zeilhofer, CARS 2003, 1257 (2003).
6. C. Kober, Semiautomatic segmentation supported by hzgt-filters, manuscript, in preparation.
7. R. M. H. McMinn, R. T. Hutchings, and B. M. Logan, Anatomie der Kopf- und Halsregion, Verlag zahnärztl.-medizin. Schrifttum, Munich (1998).
8. Mimics, Medical Software, www.materialise.be/mimics.
9. R. Putz and R. Pabst, Sobotta Atlas of Human Anatomy, Urban & Schwarzenberg, Munich, Vienna, Baltimore (1994).
10. D. Stalling, M. Zöckler, and H.-C. Hege, CAR '98, 137 (1998).
11. D. Stalling, M. Seebass, and S. Zachow, TR 98-05, ZIB Berlin (1998).
LIST OF AUTHORS

Arenas Sanchez, M. M.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Aschenbach, R.: Institute for Diagnostic Imaging, Helios Klinikum Erfurt, Nordhäuser Straße 74, D-99089 Erfurt, Germany
Ayache, N.: EPIDAURE Project, INRIA, 2004 Route des Lucioles, F-06902 Sophia Antipolis Cedex, France
Badreddin, E.: Lehrstuhl für Automation, Universität Mannheim, B6, 23-29 C, D-68131 Mannheim, Germany
Bale, R. J.: Interdisciplinary Stereotactic Intervention- and Planning Laboratory (SIP-Lab), Department of Radiology 1, University Innsbruck, Anichstr. 35, A-6020 Innsbruck, Austria
Barbe, L.: LSIIT, Parc d'Innovation, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Bastian, L.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Baum, R. P.: Clinic for Nuclear Medicine / PET Centre, Zentralklinik Bad Berka, Robert-Koch-Allee 9, D-99437 Bad Berka, Germany
Baumann, M.: University of Applied Sciences Regensburg, Prüfeninger Straße 58, D-93049 Regensburg, Germany
Bayle, B.: LSIIT, Parc d'Innovation, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Benz, M.: Chair for Optics, Institute for Information and Photonics, University Erlangen-Nuremberg, Staudtstr. 7/B2, D-91058 Erlangen, Germany
Berkelbach, J. W.: Department of Neurosurgery, University Medical Center, Heidelberglaan 100, NL-3584 CX Utrecht, the Netherlands
Berkelmann, P.: TIMC-IMAG, Institut d'Ingénierie de l'Information de Santé, F-38706 La Tronche Cedex, France
Berlinger, K.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Berti, G.: C&C Research Laboratories, NEC Europe Ltd., Rathausallee 10, D-53757 Sankt Augustin, Germany
Beth, Th.: Universität Karlsruhe, Institut für Algorithmen und Kognitive Systeme, Am Fasanengarten 5, D-76131 Karlsruhe, Germany
Bock, W. S.: General Surgery Department, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
Boesecke, R.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Boesnach, I.: Universität Karlsruhe, Institut für Algorithmen und Kognitive Systeme, Am Fasanengarten 5, D-76131 Karlsruhe, Germany
Boidard, E.: TIMC-IMAG, Institut d'Ingénierie de l'Information de Santé, F-38706 La Tronche Cedex, France
Bongartz, J.: Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany
Bonneau, E.: CEA-List, Service Robotique et Systèmes Interactifs, Centre de Fontenay-aux-Roses, BP 6, 92265 Fontenay-aux-Roses Cedex, France
Borgert, J.: Philips Research Laboratories, Division Technical Systems, Röntgenstrasse 24-26, D-22315 Hamburg, Germany
Bose, H.: University of Applied Sciences Regensburg, Prüfeninger Straße 58, D-93049 Regensburg, Germany
Bourquain, H.: MeVis - Centrum für Medizinische Diagnosesysteme und Visualisierung, Universitätsallee 29, D-28359 Bremen, Germany
Branzan Albu, A.: Computer Vision and Systems Laboratory, Dept. of Electrical and Computer Engineering, Laval University, Cité Universitaire, Sainte-Foy (QC), G1K 7P4, Canada
Bruhns, O. T.: Institute of Mechanical Engineering, Ruhr-University Bochum, D-44780 Bochum, Germany
Buss, M.: Munich University of Technology, Institute of Automatic Control Engineering, D-80290 Munich, Germany
Buzug, T. M.: Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany
Citak, M.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Dammann, F.: Abteilung für Radiologische Diagnostik, Radiologische Universitätsklinik, Universitätsklinikum, Eberhard-Karls-Universität Tübingen, Hoppe-Seyler-Strasse 3, D-72076 Tübingen, Germany
Decker, M.: Institut für Technikfolgenabschätzung und Systemanalyse (ITAS), Forschungszentrum Karlsruhe, Postfach 3640, D-76021 Karlsruhe, Germany
Delingette, H.: EPIDAURE Project, INRIA, 2004 Route des Lucioles, F-06902 Sophia Antipolis Cedex, France
Dillmann, R.: Institute for Process Control and Robotics (IPR), University of Karlsruhe (TH), Building 40.28, Engler-Bunte-Ring 8, D-76128 Karlsruhe, Germany
Doetter, M.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Doignon, C.: LSIIT, Equipe Automatique Vision et Robotique, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Dold, C.: Institute for Computer Graphics, Fraunhofer Gesellschaft, Fraunhoferstr. 5, D-64283 Darmstadt, Germany
Dombre, E.: LIRMM - UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, F-34392 Montpellier Cedex 5, France
Drexl, J.: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Egersdorfer, S.: Fraunhofer Institute for Silicate Research, Neunerplatz 2, D-97082 Würzburg, Germany
Eggers, G.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Engel, D.: Institute for Process Control and Robotics (IPR), University of Karlsruhe (TH), Building 40.28, Engler-Bunte-Ring 8, D-76128 Karlsruhe, Germany
Ermert, H.: Institute of High Frequency Engineering, Ruhr-University Bochum, D-44780 Bochum, Germany
Ernemann, U.: Abteilung für Radiologische Diagnostik, Radiologische Universitätsklinik, Universitätsklinikum, Eberhard-Karls-Universität Tübingen, Hoppe-Seyler-Strasse 3, D-72076 Tübingen, Germany
Esen, H.: Munich University of Technology, Institute of Automatic Control Engineering, D-80290 Munich, Germany
Esteve, D.: LAAS/CNRS, 7, avenue du Colonel Roche, F-31077 Toulouse, France
Ewers, R.: University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna, Austria
Fangmann, J.: Chirurgische Klinik II, Klinik für Abdominal-, Transplantations- und Gefäßchirurgie, Universität Leipzig, Liebigstr. 20a, D-04103 Leipzig, Germany
Fiegele, T.: Leopold-Franzens-Universität Innsbruck, Universitätsklinik für Neurochirurgie, Anichstraße 35, A-6020 Innsbruck, Austria
Fingber, J.: C&C Research Laboratories, NEC Europe Ltd., Rathausallee 10, D-53757 Sankt Augustin, Germany
Forest, C.: IRCAD/EITS/VIRTUALIS, 1, Place de l'Hôpital, F-67091 Strasbourg Cedex, France
Forgione, A.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Freimuth, H.: Institute for Micro Technology Mainz GmbH, Carl-Zeiss-Strasse 18-20, D-55129 Mainz, Germany
Frericks, B.: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Freund, E.: Institute of Robotics Research (IRF), University of Dortmund, Otto-Hahn-Str. 8, D-44227 Dortmund, Germany
Frey, S.: Forschungszentrum caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany
Gangi, A.: Hôpital Civil, 1, place de l'Hôpital, BP 426, F-67091 Strasbourg Cedex, France
Gangloff, J. A.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Geerling, J.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Ghanai, S.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Giel, D.: Forschungszentrum caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany
Ginhoux, R.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Gösling, M.: Unfallchirurgische Klinik, Medizinische Hochschule Hannover (MHH), Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Gravez, P.: CEA-List, Service Robotique et Systèmes Interactifs, Centre de Fontenay-aux-Roses, BP 6, 92265 Fontenay-aux-Roses Cedex, France
Grewer, R.: Philips Research Laboratories, Division Technical Systems, Röntgenstrasse 24-26, D-22315 Hamburg, Germany
Gross, I.: University of Siegen, Institute of Automatic Control Engineering, ZESS - Centrum for Sensor-Systems, Paul-Bonatz-Str. 9-11, D-57068 Siegen, Germany
Hahn, M.: Universität Karlsruhe, Institut für Algorithmen und Kognitive Systeme, Am Fasanengarten 5, D-76131 Karlsruhe, Germany
Hamm, K.: Dept. for Stereotactic Neurosurgery and Radiosurgery, Helios Klinikum Erfurt, Nordhäuser Straße 74, D-99089 Erfurt, Germany
Handels, H.: University Hospital Hamburg-Eppendorf, Institute of Medical Informatics, House S 14, Martinistr. 52, D-20246 Hamburg, Germany
Harms, J.: Chirurgische Klinik II, Klinik für Abdominal-, Transplantations- und Gefäßchirurgie, Universität Leipzig, Liebigstr. 20a, D-04103 Leipzig, Germany
Hartmann, U.: Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany
Hassfeld, S.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Hattingen, E.: Johann Wolfgang Goethe University, Institute of Neuroradiology, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Häusler, G.: Chair for Optics, Institute for Information and Photonics, University Erlangen-Nuremberg, Staudtstr. 7/B2, D-91058 Erlangen, Germany
Hauss, J.: Chirurgische Klinik II, Klinik für Abdominal-, Transplantations- und Gefäßchirurgie, Universität Leipzig, Liebigstr. 20a, D-04103 Leipzig, Germany
Hayashi, M.: Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Heiland, M.: University Hospital Hamburg-Eppendorf, Klinik und Poliklinik für Zahn-, Mund-, Kiefer- und Gesichtschirurgie (Nordwestdeutsche Kieferklinik), Martinistr. 52, D-20246 Hamburg, Germany
Heinze, F.: Institute of Robotics Research (IRF), University of Dortmund, Otto-Hahn-Str. 8, D-44227 Dortmund, Germany
Hellier, P.: Projet Vista, IRISA/INRIA-CNRS, Campus Universitaire de Beaulieu, F-35042 Rennes Cedex, France
Henrich, D.: Lehrstuhl für Angewandte Informatik III (Robotik und Eingebettete Systeme), Universität Bayreuth, D-95440 Bayreuth, Germany
Hering, P.: Holography and Laser Technology, Stiftung caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany
Herrmann, E.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Hertzberg, J.: Fraunhofer Institute for Autonomous Intelligent Systems (AIS), Schloss Birlinghoven, D-53754 Sankt Augustin, Germany
Hierl, T.: Klinik und Poliklinik für Mund-, Kiefer- und Plastische Gesichtschirurgie, Universitätsklinikum Leipzig, Nürnberger Str. 57, D-04103 Leipzig, Germany
Hodgson, A. J.: Neuromotor Control Laboratory, Department of Mechanical Engineering, University of British Columbia, 2324 Main Mall, Vancouver, B.C., Canada V6T 1Z4
Hoffmann, J.: Klinik und Poliklinik für Mund-, Kiefer- und Gesichtschirurgie, Universitätsklinikum Tübingen, Osianderstrasse 2-8, D-72076 Tübingen, Germany
Höhne, K. H.: University Hospital Hamburg-Eppendorf, Institute of Medical Informatics, House S 14, Martinistr. 52, D-20246 Hamburg, Germany
Holz, D.: Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany
Hüfner, T.: Unfallchirurgische Klinik, Medizinische Hochschule Hannover (MHH), Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Iseki, H.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Ivanenko, M.: Holography and Laser Technology, Stiftung caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany
Kahn, T.: Klinik für Diagnostische und Interventionelle Radiologie, Universität Leipzig, Liebigstr. 20a, D-04103 Leipzig, Germany
Kamucha, G.: Department of Electrical & Communications Engineering, Moi University, P.O. Box 3900, Eldoret, Kenya
Kassim, I.: Clinical Research Unit, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
Kendoff, D.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Kfuri, M., Jr.: Alexander-von-Humboldt-Stiftung, Ribeirão Preto Medical School, São Paulo, Brazil
Khaled, W.: Institute of High Frequency Engineering, Ruhr-University Bochum, D-44780 Bochum, Germany
Kleinert, G.: Dept. for Stereotactic Neurosurgery and Radiosurgery, Helios Klinikum Erfurt, Nordhäuser Straße 74, D-99089 Erfurt, Germany
Knappe, P.: University of Siegen, Institute of Automatic Control Engineering, ZESS - Centrum for Sensor-Systems, Paul-Bonatz-Str. 9-11, D-57068 Siegen, Germany
Knappe, V.: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Knoop, H.: Universität Karlsruhe (TH), Institut für Prozessrechentechnik, Automation und Robotik (IPR), Gebäude 40.28, Engler-Bunte-Ring 8, D-76131 Karlsruhe, Germany
Kober, C.: Univ. of Appl. Sc. Osnabrück, Albrechtstr. 30, P.O. Box 19 40, D-49009 Osnabrück, Germany
Kompa, G.: Department of High Frequency Engineering (HFT), Kassel University, Wilhelmshöher Allee 73, D-34121 Kassel, Germany
Köpfle, A.: Lehrstuhl für Informatik V, Universität Mannheim, B6, 23-29 C, D-68161 Mannheim, Germany
Korb, W.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Kotrikova, B.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Krettek, C.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Krishnan, R.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Krüger, S.: Philips Research Laboratories, Division Technical Systems, Röntgenstrasse 24-26, D-22315 Hamburg, Germany
Krüger, T.: Department of Maxillofacial Surgery, Clinical Navigation and Robotics, Medical Faculty Charité, Humboldt University at Berlin, D-13353 Berlin, Germany
Kübler, A.: Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, D-50937 Köln, Germany
Kubo, O.: Department of Neurosurgery, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Lamy, S.: CEA-List, Service Robotique et Systèmes Interactifs, Centre de Fontenay-aux-Roses, BP 6, 92265 Fontenay-aux-Roses Cedex, France
Laurendeau, D.: Computer Vision and Systems Laboratory, Dept. of Electrical and Computer Engineering, Laval University, Cité Universitaire, Sainte-Foy (QC), G1K 7P4, Canada
Lehmann, K. S.: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Leimnei, A.: Laboratoire TIMC, Equipe GMCAO, Institut d'Ingénierie de l'Information de Santé, Faculté de Médecine, F-38706 La Tronche Cedex, France
Leroy, J.: Institut de Recherche sur les Cancers de l'Appareil Digestif (IRCAD), Hôpital Civil, BP 426, F-67091 Strasbourg Cedex, France
Letteboer, M.: Image Sciences Institute, University Medical Center, Heidelberglaan 100, NL-3584 CX Utrecht, the Netherlands
Liehr, F.: University of Duisburg-Essen, Lotharstrasse 65, D-47048 Duisburg, Germany
Lingemann, K.: Fraunhofer Institute for Autonomous Intelligent Systems (AIS), Schloss Birlinghoven, D-53754 Sankt Augustin, Germany
Lorenz, A.: LP-IT Innovative Technologies GmbH, Huestr. 5, D-44787 Bochum, Germany
Lueth, T.: Department of Maxillofacial Surgery, Clinical Navigation and Robotics, Medical Faculty Charité, Humboldt University at Berlin, D-13353 Berlin, Germany
Maassen, M. M.: Department of Otolaryngology - Head & Neck Surgery, University of Tübingen, Elfriede-Aulhorn-Straße 5, D-72076 Tübingen, Germany
Maier, T.: Chair for Optics, Institute for Information and Photonics, University Erlangen-Nuremberg, Staudtstr. 7/B2, D-91058 Erlangen, Germany
Maillet, P.: LIRMM - UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, F-34392 Montpellier Cedex 5, France
Malthan, D.: Fraunhofer Institute for Manufacturing Engineering and Automation (IPA), Nobelstraße 12, D-70569 Stuttgart, Germany
Männer, R.: Lehrstuhl für Informatik V, Universität Mannheim, B6, 23-29 C, D-68161 Mannheim, Germany
Marescaux, J.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Marmignon, C.: Laboratoire TIMC, Equipe GMCAO, Institut d'Ingénierie de l'Information de Santé, Faculté de Médecine, F-38706 La Tronche Cedex, France
Marmulla, R.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Maruyama, T.: Department of Neurosurgery, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Mathelin de, M. F.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Maurin, B.: LSIIT, Parc d'Innovation, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Meer, F. van: LAAS/CNRS, 7, avenue du Colonel Roche, F-31077 Toulouse, France
Michelin, M.: LIRMM - UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, F-34392 Montpellier Cedex 5, France
Mischkowski, R. A.: Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, D-50937 Köln, Germany
Moisan, C.: Dept. of Radiology, Laval University, Cité Universitaire, Sainte-Foy (QC), G1K 7P4, Canada
Moldenhauer, J.: Universität Karlsruhe, Institut für Algorithmen und Kognitive Systeme, Am Fasanengarten 5, D-76131 Karlsruhe, Germany
Monkman, G.: Fraunhofer Institute for Silicate Research, Neunerplatz 2, D-97082 Würzburg, Germany
Mössinger, E.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Mühling, J.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Muragaki, Y.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Mutter, D.: IRCAD/EITS/VIRTUALIS, 1, Place de l'Hôpital, F-67091 Strasbourg Cedex, France
Nageotte, F.: LSIIT, Equipe Automatique Vision et Robotique, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Nakamura, R.: Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Nambu, K.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Neukam, F. W.: Department of Oral and Maxillofacial Surgery, University Erlangen-Nuremberg, Glückstr. 11, D-91054 Erlangen, Germany
Nicolau, S.: IRCAD/EITS/VIRTUALIS, 1, Place de l'Hôpital, F-67091 Strasbourg Cedex, France
Niesen, A.: Clinic for Nuclear Medicine / PET Centre, Zentralklinik Bad Berka, Robert-Koch-Allee 9, D-99437 Bad Berka, Germany
Niessen, W.: Image Sciences Institute, University Medical Center, Heidelberglaan 100, NL-3584 CX Utrecht, the Netherlands
Nishizawa, K.: Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Nkenke, E.: Department of Oral and Maxillofacial Surgery, University Erlangen-Nuremberg, Glückstr. 11, D-91054 Erlangen, Germany
Nüchter, A.: Fraunhofer Institute for Autonomous Intelligent Systems (AIS), Schloss Birlinghoven, D-53754 Sankt Augustin, Germany
Ohnsorge, J. A. K.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Orthopädische Universitätsklinik, UK Aachen, Pauwelsstrasse 30, D-52074 Aachen, Germany
Oldhafer, K.: MeVis - Centrum für Medizinische Diagnosesysteme und Visualisierung, Universitätsallee 29, D-28359 Bremen, Germany
Oomori, S.: Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
O'Sullivan, N.: MKG-Chirurgie, Ruprecht-Karls-University Heidelberg, Im Neuenheimer Feld 400, D-69120 Heidelberg, Germany
Peitgen, H.-O.: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Pennec, X.: EPIDAURE Project, INRIA, 2004 Route des Lucioles, F-06902 Sophia Antipolis Cedex, France
Pesavento, A.: LP-IT Innovative Technologies GmbH, Huestr. 5, D-44787 Bochum, Germany
Petersik, A.: University Hospital Hamburg-Eppendorf, Institute of Medical Informatics, House S 14, Martinistr. 52, D-20246 Hamburg, Germany
Pflesser, B.: University Hospital Hamburg-Eppendorf, Institute of Medical Informatics, House S 14, Martinistr. 52, D-20246 Hamburg, Germany
Pieck, S.: University of Siegen, Institute of Automatic Control Engineering, ZESS - Centrum for Sensor-Systems, Paul-Bonatz-Str. 9-11, D-57068 Siegen, Germany
Plaskos, C.: Laboratoire TIMC-IMAG, Groupe GMCAO, Faculté de Médecine de Grenoble, Joseph Fourier University, F-38706 La Tronche Cedex, France
Pohl, Y.: Poliklinik für Chirurgische Zahn-, Mund- und Kieferheilkunde, University Dental Clinic Bonn, Welschnonnenstr. 17, D-53111 Bonn, Germany
Poignet, P.: LIRMM - UMR 5506 CNRS / Université Montpellier II, 161 rue Ada, F-34392 Montpellier Cedex 5, France
Pott, P. P.: Labor für Biomechanik und experimentelle Orthopädie, Orthopädische Universitätsklinik Mannheim, Theodor-Kutzer-Ufer 1-3, D-68167 Mannheim, Germany
Prescher, A.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Institut für Anatomie, UK Aachen, Wendlingweg 2, D-52074 Aachen, Germany
Preusser, T.: CeVis, University of Bremen, Universitätsallee 29, D-28359 Bremen, Germany
Raabe, A.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Raczkowsky, J.: Institute for Process Control and Robotics (IPR), University of Karlsruhe (TH), Building 40.28, Engler-Bunte-Ring 8, D-76128 Karlsruhe, Germany
Rancourt, D.: Dept. of Mechanical Engineering, Sherbrooke University, 2500 boul. de l'Université, Sherbrooke (QC), J1K 2R1, Canada
Rautmann, M.: Orthopädische Universitätsklinik Mannheim, Theodor-Kutzer-Ufer 1-3, D-68167 Mannheim, Germany
Reichling, S.: Institute of Mechanical Engineering, Ruhr-University Bochum, D-44780 Bochum, Germany
Reinert, S.: Klinik und Poliklinik für Mund-, Kiefer- und Gesichtschirurgie, Universitätsklinikum Tübingen, Osianderstrasse 2-8, D-72076 Tübingen, Germany
Richter, M.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Roßmann, J.: Institute of Robotics Research (IRF), University of Dortmund, Otto-Hahn-Str. 8, D-44227 Dortmund, Germany
Roth, M.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Rueckert, D.: Department of Computing, Imperial College, South Kensington Campus, 180 Queen's Gate, SW7 2AZ London, UK
Rumpf, M.: University of Duisburg-Essen, Lotharstrasse 65, D-47048 Duisburg, Germany
Sader, R.: University Hospital Basel, Spitalstrasse 21, CH-4031 Basel, Switzerland; HFZ, TU Munich, Ismaningerstr. 22, D-81675 Munich, Germany
Sakuma, I.: The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-8654, Japan
Salb, T.: Institute for Process Control and Robotics (IPR), University of Karlsruhe (TH), Building 40.28, Engler-Bunte-Ring 8, D-76128 Karlsruhe, Germany
Sauer, O.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Sauter, S.: University of Zurich, Winterthurerstrasse 190, CH-8057 Zurich, Switzerland
Scharf, H. P.: Orthopädische Universitätsklinik Mannheim, Theodor-Kutzer-Ufer 1-3, D-68167 Mannheim, Germany
Schauer, D.: Department of Maxillofacial Surgery, Clinical Navigation and Robotics, Medical Faculty Charité, Humboldt University at Berlin, D-13353 Berlin, Germany
Schicho, K.: University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna, Austria
Schill, M.: VRmagic GmbH, B6, 23-29 C, D-68032 Mannheim, Germany
Schkommodau, E.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Institut für Biomedizinische Technologien, UK Aachen, Pauwelsstrasse 20, D-52074 Aachen, Germany
Schmidt, J. G.: C&C Research Laboratories, NEC Europe Ltd., Rathausallee 10, D-53757 Sankt Augustin, Germany
Schmitz, G.: Department of Mathematics and Technology, RheinAhrCampus Remagen, Südallee 2, D-53424 Remagen, Germany
Schmücking, M.: Clinic for Nuclear Medicine / PET Centre, Zentralklinik Bad Berka, Robert-Koch-Allee 9, D-99437 Bad Berka, Germany
Schön, N.: Chair for Optics, Institute for Information and Photonics, University Erlangen-Nuremberg, Staudtstr. 7/B2, D-91058 Erlangen, Germany
Schwaderer, E.: Department of Diagnostic Radiology, University Hospital of Tübingen, Otfried-Müller-Str. 10, D-72076 Tübingen, Germany
Schwald, B.: Computer Graphics Center, Department Visual Computing, Fraunhoferstr. 5, D-64283 Darmstadt, Germany
Schwarz, M. L. R.: Labor für Biomechanik und experimentelle Orthopädie, Orthopädische Universitätsklinik Mannheim, Theodor-Kutzer-Ufer 1-3, D-68167 Mannheim, Germany
Schweikard, A.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Seemann, R.: University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna, Austria
Seifert, V.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Shin, Hoen-oh: Center for Medical Diagnostic Systems and Visualization (MeVis) GmbH, Universitätsallee 29, D-28359 Bremen, Germany
Siebert, C. H.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Orthopädische Universitätsklinik, UK Aachen, Pauwelsstrasse 30, D-52074 Aachen, Germany
Siessegger, M.: Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, D-50937 Köln, Germany
Sing, N. W.: Computer Integrated Medical Intervention Lab, Nanyang Technological University, 50 Nanyang Ave, Singapore 639798
Soler, L.: LSIIT, Louis Pasteur University, Pôle API, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Spetzger, U.: Neurochirurgische Klinik, Klinikum Karlsruhe, Moltkestraße 90, D-76133 Karlsruhe, Germany
Stallkamp, J.: Fraunhofer Institute for Manufacturing Engineering and Automation (IPA), Nobelstr. 12, D-70569 Stuttgart, Germany
Stolka, P.: Lehrstuhl für Angewandte Informatik III (Robotik und Eingebettete Systeme), Universität Bayreuth, D-95440 Bayreuth, Germany
Sugiura, M.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Surber, G.: Dept. for Stereotactic Neurosurgery and Radiosurgery, Helios Klinikum Erfurt, Nordhäuser Straße 74, D-99089 Erfurt, Germany
Surmann, H.: Fraunhofer Institute for Autonomous Intelligent Systems (AIS), Schloss Birlinghoven, D-53754 Sankt Augustin, Germany
Suzukawa, K.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Szelényi, A.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Taha, F.: CHU Amiens, Service de Chirurgie Maxillo-Faciale, CHU Nord, 80054 Amiens, France
Takakura, K.: Faculty of Advanced Techno-Surgery (FATS), Institute of Advanced Biomedical Engineering & Science, Graduate School of Medicine, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Thelen, A.: Forschungszentrum caesar, Ludwig-Erhard-Allee 2, D-53175 Bonn, Germany
Tiede, U.: University Hospital Hamburg-Eppendorf, Institute of Medical Informatics, House S 14, Martinistr. 52, D-20246 Hamburg, Germany
Timminger, H.: Department of Measurement, Control, and Microtechnology, University Ulm, Albert-Einstein-Allee 41, D-89081 Ulm, Germany
Tomokatsu, H.: Department of Neurosurgery, Tokyo Women's Medical University, 8-1 Kawada-cho, Shinjuku-ku, Tokyo 162-8666, Japan
Troccaz, J.: TIMC-IMAG, Institut d'Ingénierie de l'Information de Santé, F-38706 La Tronche Cedex, France
Troitzsch, D.: Klinik und Poliklinik für Mund-, Kiefer- und Gesichtschirurgie, Universitätsklinikum Tübingen, Osianderstrasse 2-8, D-72076 Tübingen, Germany
Tunayar, A.: Institute for Micro Technology Mainz GmbH, Carl-Zeiss-Strasse 18-20, D-55129 Mainz, Germany
Vences, L.: Klinik und Poliklinik für Strahlentherapie, Universität Würzburg, Josef-Schneider-Str. 11, D-97080 Würzburg, Germany
Vogt, F.: Chair for Pattern Recognition, University Erlangen-Nuremberg, Martensstr. 3, D-91058 Erlangen, Germany
Wagner, A.: Lehrstuhl für Automation, Universität Mannheim, B6, 23-29 C, D-68131 Mannheim, Germany
Wagner, R.: University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna, Austria
Wahl, G.: Poliklinik für Chirurgische Zahn-, Mund- und Kieferheilkunde, University Dental Clinic Bonn, Welschnonnenstr. 17, D-53111 Bonn, Germany
Wahrburg, J.: University of Siegen, Institute of Automatic Control Engineering, ZESS - Centrum for Sensor-Systems, Paul-Bonatz-Str. 9-11, D-57068 Siegen, Germany
Wechsler, A.: Fraunhofer Institute for Manufacturing Engineering and Automation (IPA), Nobelstr. 12, D-70569 Stuttgart, Germany
Weikard, U.: University of Duisburg-Essen, Lotharstrasse 65, D-47048 Duisburg, Germany
Weiser, P.: Institut für CAE, Fachhochschule Mannheim, Windeckstr. 110, D-68163 Mannheim, Germany
Westendorff, C.: Klinik und Poliklinik für Mund-, Kiefer- und Gesichtschirurgie, Universitätsklinikum Tübingen, Osianderstrasse 2-8, D-72076 Tübingen, Germany
Widmann, E.: Ordination Dr. Widmann, Unterauweg 7a, A-6280 Zell/Ziller, Austria
Widmann, G.: Interdisciplinary Stereotactic Intervention- and Planning Laboratory (SIP-Lab), Department of Radiology 1, University Innsbruck, Anichstr. 35, A-6020 Innsbruck, Austria
Widmann, R.: Zahntechnisches Labor Czech, Kreuzstr. 20, A-6067 Absam, Austria
Wildberger, J. E.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Klinik für Radiologische Diagnostik, UK Aachen, Pauwelsstrasse 30, D-52074 Aachen, Germany
Willems, P.: Department of Neurosurgery, University Medical Center, Heidelberglaan 100, NL-3584 CX Utrecht, the Netherlands
Wirtz, D. C.: Rheinisch-Westfälische Technische Hochschule, RWTH Aachen, Orthopädische Universitätsklinik, UK Aachen, Pauwelsstrasse 30, D-52074 Aachen, Germany
Wittwer, G.: University Hospital of Cranio-Maxillofacial and Oral Surgery, Medical School, University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna, Austria
Woessner, S.: Fraunhofer Institute for Manufacturing Engineering and Automation (IPA), Nobelstr. 12, D-70569 Stuttgart, Germany
Wolff, R.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Wörn, H.: Institute for Process Control and Robotics (IPR), University of Karlsruhe (TH), Building 40.28, Engler-Bunte-Ring 8, D-76128 Karlsruhe, Germany
Wu, R.: Clinical Research Unit, Tan Tock Seng Hospital, 11 Jalan Tan Tock Seng, Singapore 308433
Wyslucha, U.: MAQUET GmbH & Co. KG, Kehler Straße 31, D-76437 Rastatt, Germany
Yahya, H.: Johann Wolfgang Goethe University, Institute of Neuroradiology, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Yano, K.: Toyohashi University of Technology, Department of Production Systems Engineering, Hibarigaoka 1-1, Tempaku, 441-8580 Toyohashi, Japan
Zaitsev, M.: University Hospital of Freiburg, Department Radiology, Hugstetter Str. 55, D-79106 Freiburg, Germany
Zanne, P.: LSIIT, Parc d'Innovation, Boulevard Sébastien Brant, BP 10413, F-67412 Illkirch Cedex, France
Zech, S.: Trauma Department, Hannover Medical School, Carl-Neuberg-Str. 1, D-30625 Hannover, Germany
Zeilhofer, H.-F.: University Hospital Basel, Spitalstrasse 21, CH-4031 Basel, Switzerland; HFZ, TU Munich, Ismaningerstr. 22, D-81675 Munich, Germany
Zimmermann, M.: Department of Neurosurgery, Johann Wolfgang Goethe University, Schleusenweg 2-16, D-60528 Frankfurt/Main, Germany
Zinser, M.: Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, D-50937 Köln, Germany
Zöller, J. E.: Department of Cranio-Maxillofacial and Plastic Surgery, University of Cologne, Kerpener Str. 62, D-50937 Köln, Germany