A method and apparatus for collecting and processing physical space data used while performing image-guided surgery is disclosed. Physical space data is collected by probing physical surface points of surgically exposed tissue. The physical space data provides three-dimensional (3-D) coordinates for each of the physical surface points. Based on the physical space data collected, point-based registrations used to indicate surgical position in both image space and physical space are determined. The registrations are used to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the instrument, the surgically exposed tissue, and a particular portion of the tissue to be resected or ablated. The image data is updated on a periodic basis.

Patent
   6584339
Priority
Jun 27 2001
Filed
Jun 27 2001
Issued
Jun 24 2003
Expiry
Jul 16 2021
Extension
19 days
24. A method of collecting and processing physical space data for use while performing image-guided surgery, the method comprising:
(a) surgically exposing tissue of a living patient;
(b) collecting physical space data by probing a plurality of physical surface points of the exposed tissue, the physical space data providing three-dimensional (3-D) coordinates for each of the physical surface points;
(c) based on the physical space data collected in step (b), determining point-based registrations used to indicate surgical position in both image space and physical space;
(d) using the registrations determined in step (c) to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the instrument, the tissue, and a particular portion of the tissue to be resected or ablated; and
(e) updating the image data on a periodic basis.
17. An article of manufacture for collecting and processing physical space data for use while performing image-guided surgery, the article of manufacture comprising a computer-readable medium holding computer-executable instructions for performing a method comprising:
(a) determining point-based registrations used to indicate surgical position in both image space and physical space by processing physical space data collected by probing a plurality of physical surface points of surgically exposed tissue of a living patient, the physical space data providing three-dimensional (3-D) coordinates for each of the physical surface points;
(b) using the point-based registrations to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the ablative instrument, the tissue, and a particular portion of the tissue to be resected or ablated; and
(c) updating the image data on a periodic basis.
1. Apparatus for collecting and processing physical space data for use while performing image-guided surgery, the apparatus comprising:
(a) a probe instrument for collecting physical space data by probing a plurality of physical surface points of surgically exposed tissue of a living patient, the physical space data providing three-dimensional (3-D) coordinates for each of the physical surface points;
(b) an ablative instrument for resecting or ablating a particular portion of the exposed tissue; and
(c) an image data processor comprising a computer-readable medium holding computer-executable instructions for:
(i) based on the physical space data collected by the probe instrument, determining point-based registrations used to indicate surgical position in both image space and physical space;
(ii) using the point-based registrations to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the ablative instrument, the tissue, and a particular portion of the tissue to be resected or ablated; and
(iii) updating the image data on a periodic basis.
2. The apparatus of claim 1, wherein the ablative instrument emits a plurality of intermittent infrared signals used to triangulate the position of the ablative instrument in 3-D image space, the signals being emitted from a plurality of infrared emitting diodes (IREDs) distributed over the surface of a handle of the ablative instrument in a spiraling fashion.
3. The apparatus of claim 2, wherein the IREDs flash in time sequence.
4. The apparatus of claim 3, wherein each IRED has a 60 degree transmission angle.
5. The apparatus of claim 1, further comprising:
(d) a scanning device for scanning tissue of the patient to acquire, store and process a 3-D reference of tissue prior to the tissue being surgically exposed, wherein the image data processor creates a triangularized mesh based on the scanned tissue, determines the volumetric center of a particular portion of the tissue to be resected or ablated during the surgery, and implements an algorithm using the triangularized mesh and the physical space data collected by the probe instrument to determine the point-based registrations.
6. The apparatus of claim 5, wherein the algorithm is a Besl and Mackay iterative closest point (ICP) registration algorithm.
7. The apparatus of claim 5, wherein the scanning device is one of the following scanners: a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner and a positron emission tomography (PET) scanner.
8. The apparatus of claim 1, wherein the ablative instrument has a tip comprising an ablation device.
9. The apparatus of claim 8, wherein the ablation zone extends 1 centimeter from the tip of the ablative instrument.
10. The apparatus of claim 1, wherein the probe instrument is swept over the surface of the exposed tissue.
11. The apparatus of claim 1, wherein the image data is updated in real time at 30 Hz or greater.
12. The apparatus of claim 1, wherein the ablative instrument uses one of radio-frequency and cryoablation to resect or ablate the particular portion of the tissue.
13. The apparatus of claim 1, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) image space.
14. The apparatus of claim 1, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) laparoscopic video space using a direct linear transformation (DLT).
15. The apparatus of claim 1, wherein points from 3-D physical space are mapped to 3-D tomographic image space.
16. The apparatus of claim 1, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) endoscopic image space.
18. The article of manufacture of claim 17, wherein the computer-executable instructions perform a method further comprising:
(d) creating a triangularized mesh based on a 3-D reference of tissue of the patient, the 3-D reference being acquired, stored and processed prior to the tissue being surgically exposed;
(e) determining the volumetric center of a particular portion of the tissue to be resected or ablated during the surgery; and
(f) implementing an algorithm using the triangularized mesh and the physical space data to determine the point-based registrations.
19. The article of manufacture of claim 18, wherein the algorithm is a Besl and Mackay iterative closest point (ICP) registration algorithm.
20. The article of manufacture of claim 17, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) image space.
21. The article of manufacture of claim 17, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) laparoscopic video space using a direct linear transformation (DLT).
22. The article of manufacture of claim 17, wherein points from 3-D physical space are mapped to 3-D tomographic image space.
23. The article of manufacture of claim 17, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) endoscopic image space.
25. The method of claim 24, wherein the ablative instrument emits a plurality of intermittent infrared signals used to triangulate the position of the ablative instrument in 3-D image space, the signals being emitted from a plurality of infrared emitting diodes (IREDs) distributed over the surface of a handle of the ablative instrument in a spiraling fashion.
26. The method of claim 25, wherein the IREDs flash in time sequence.
27. The method of claim 26, wherein each IRED has a 60 degree transmission angle.
28. The method of claim 24, further comprising:
(f) prior to surgery, scanning tissue of the patient to acquire, store and process a 3-D reference;
(g) creating a triangularized mesh based on the scanned tissue; and
(h) determining the volumetric center of a particular portion of the tissue to be resected or ablated during the surgery, wherein an algorithm using the triangularized mesh and the physical space data collected in step (b) is implemented to determine the registrations in step (c).
29. The method of claim 28, wherein the algorithm is a Besl and Mackay iterative closest point (ICP) registration algorithm.
30. The method of claim 28, wherein step (f) is performed by one of a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner and a positron emission tomography (PET) scanner.
31. The method of claim 24, wherein the ablative instrument has a tip comprising an ablation device.
32. The method of claim 31, wherein the ablation zone extends 1 centimeter from the tip of the ablative instrument.
33. The method of claim 24, wherein step (b) comprises sweeping an optically tracked localization probe over the surface of the exposed tissue.
34. The method of claim 24, wherein the tissue is the patient's liver and the particular portion of tissue to be resected or ablated is a hepatic metastatic tumor.
35. The method of claim 24, wherein the image data is updated in real time at 30 Hz or greater.
36. The method of claim 24, wherein the ablative instrument uses one of radio-frequency and cryoablation to resect or ablate the particular portion of the tissue.
37. The method of claim 24, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) image space.
38. The method of claim 24, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) laparoscopic video space using a direct linear transformation (DLT).
39. The method of claim 24, wherein points from 3-D physical space are mapped to 3-D tomographic image space.
40. The method of claim 24, wherein points from 3-D physical space are mapped to 2-dimensional (2-D) endoscopic image space.

This work was supported in part by grants from the National Institutes of Health (NIH) and the National Science Foundation (NSF) (NIH grants NIGMS GM52798 and NRSA #1 F32 DK 09671-01 SB; and NSF grant BES-9703714) and the U.S. Government may therefore have certain rights in this invention.

1. Field of the Invention

The present invention relates to using image-guided surgery techniques to collect data to ensure accurate tracking of an ablation device.

2. Background Information

For over fifty years, diagnostic images have been used for surgical guidance, especially in the field of neurosurgery. Image-guided surgery implements two fundamental ideas: first, the concept of an image-space to physical-space mapping, or registration, and second, the use of an extracranial device for accurate surgical guidance without direct visualization. Such ideas gave birth to stereotactic neurosurgery, a technique for locating targets of surgical interest within the brain relative to an external frame of reference. Stereotaxy is traditionally defined as the temporary attachment of a mechanical frame to the skull or scalp in order to define a 3-D frame space around a patient. With the advent of computed tomography (CT), the coordinates of a target (e.g., a tumor) in image space could be assigned coordinates in frame space if the CT images were obtained with the attached frame. Unfortunately, frames are uncomfortable for patients, must be applied prior to imaging, and are cumbersome in the imaging environment and the operating room.

These factors led to the development of frameless stereotactic surgical systems, or interactive, image-guided surgery (IIGS) systems. In traditional IIGS systems, present surgical position is tracked during an operation and displayed on pre-operatively obtained tomographic images. As the surgeon changes the current surgical position, displayed images are updated in real time. In one of the earliest IIGS systems, physical space surgical position was determined using articulated arms. The position of an articulated pointer was calculated using a personal computer (PC) and overlaid on tomographic images. Magnetic resonance images (MRI) and CT negative films were scanned into the computer and displayed as images on a video interface. Other early image-guided surgical systems also used electromechanical 3-D coordinate digitizers to indicate present surgical position on various representations of patient data, including 2-D transverse, coronal and sagittal CT or MRI slices, and on image renderings of the physical object surface. Since it was necessary to have computers capable of managing large volumes of image information (>100 Mbytes) and updating the display quickly, most early IIGS systems were developed with VME bus devices running UNIX.

Early IIGS systems were also developed on PCs using multiple processors. In a task-oriented asymmetric multiprocessing (TOAM) system developed in 32 bit extended DOS, discrete tasks such as physical space localization, data fetching, and display were conducted asynchronously on specialized processors which communicated with inexpensive, general purpose processors that worked as loaders and schedulers. For physical space localization, several articulated arms with six degrees of freedom were first developed. These cumbersome arm devices were eventually replaced with lightweight cylindrical pen-like probes which could be tracked more easily in the operating room using an optical triangulation system. The spatial location of the guidance instrument was determined using a collection of discrete processors which continually updated the physical space location. This location was then passed to the central processor where it was mapped into image space. Once the image space map was complete, the appropriate tomographic slices were selected and displayed. Because this system was designed before the advent of large memory availability, image display relied heavily on hardware manipulation using disk controllers to load images directly from the hard drive. Control of the bus was passed from the main processor to the disk drive controller, where the correct image was fetched and sent to the display processor.

With the continuing increase in the performance-to-price ratio of PCs, processes which could once be performed only on workstation-class machines are now routinely performed on PCs. As the PC hardware evolved, however, it became apparent that DOS-based systems would not have the continuing support of hardware vendors.

Because of these considerations, a need for an operating room image-oriented navigation system (ORION) emerged. ORION was developed in Windows NT using MS Visual C++ 6.0 with the Win32 API. Not only was this system designed to be faster than the previous one, but it was also designed so that the software would not have to be redesigned with each hardware advance. Components of the system were developed as dynamic link libraries (DLLs), so that new technology could be incorporated into the system without a complete software rewrite. The system is also somewhat portable. It runs adequately on any PC with a 200 MHz or faster Pentium processor, 128 MB of memory, and the appropriate video card and 3-D localizer hardware and software.

When designing an image-guided surgical system, it is critical that the precise location of an ablative instrument used to perform image-guided surgery be determined on a continuous basis (e.g., update rates approaching 30 frames per second). Further, in an effort to ensure the utmost in precision, an ablation zone of the ablative instrument, the tissue being operated on, and a particular portion of the tissue to be resected or ablated during surgery must also be continuously and accurately tracked.

What is needed is a method and apparatus for collecting and processing physical space data for use while performing image-guided surgery, such that tumors and lesions in the tissue of a living patient can be accurately located, and resected or ablated with a precisely tracked ablative instrument. The present invention fulfills such a need.

In interactive, image-guided surgery, current physical space position in the operating room is displayed on various sets of medical images used for surgical navigation. The present invention is a PC-based surgical guidance system which synchronously displays surgical position on up to four image sets and updates them in real time. There are three essential components and techniques which have been developed for this system: 1) accurately tracked ablative instruments, 2) accurate registration techniques to map physical space to image space, and 3) methods and apparatus to display and update the image sets on a computer monitor. For each of these components, a set of dynamic link libraries has been developed in MS Visual C++ 6.0 supporting various hardware tools and software techniques. Surgical (i.e., ablative) instruments are tracked in physical space using an active optical tracking system. Several different registration algorithms were developed with a library of robust math kernel functions, and the accuracy of all registration techniques has been thoroughly investigated. The present invention was developed using the Win32 API for windows management and tomographic visualization, a frame grabber for live video capture, and OpenGL for visualization of surface renderings. This current implementation of the present invention can be used for several surgical procedures, including open and minimally invasive liver surgery.

In a method according to the present invention, physical space data is collected and processed for use while performing image-guided surgery. Tissue of a living patient is surgically exposed. Physical space data is then collected by probing a plurality of physical surface points of the exposed tissue, the physical space data providing three-dimensional (3-D) coordinates for each of the physical surface points. Based on the physical space data collected, point-based registrations used to indicate surgical position in both image space and physical space are determined. The registrations are used to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the instrument, the tissue, and a particular portion of the tissue to be resected or ablated. The image data is updated on a periodic basis.

The collection of physical space data may be performed by sweeping an optically tracked localization probe over the surface of the exposed tissue. The tissue may be the patient's liver and the particular portion of tissue to be resected or ablated may be a hepatic metastatic tumor.

Prior to surgery, tissue of the patient may be scanned to acquire, store and process a 3-D description of the organ or structure of interest (e.g., a 3-D reference). A triangularized mesh may be created based on the scanned tissue. The volumetric center of a particular portion of the tissue to be resected or ablated during the surgery may be determined, wherein an algorithm using the triangularized mesh and the collected physical space data may be implemented to determine the point-based registrations. The algorithm may be a Besl and McKay iterative closest point (ICP) registration algorithm.

The scanning of the tissue may be performed by one of a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner and a positron emission tomography (PET) scanner.

The ablative instrument may emit a plurality of intermittent infrared signals used to triangulate the position of the ablative instrument in 3-D image space. The signals may be emitted from a plurality of infrared emitting diodes (IREDs) distributed over the surface of a handle of the ablative instrument in a spiraling fashion. The IREDs may flash in time sequence. Each IRED may have a 60 degree transmission angle.

The image data may be updated in real time at 30 Hz or greater. The ablative instrument may use one of radio-frequency and cryoablation to resect or ablate the particular portion of the tissue. The ablative instrument may have a tip comprising an ablation device. The ablation zone may extend 1 centimeter from the tip of the ablative instrument.

Information from the localizer may also be used in conjunction with laparoscopic or endoscopic imaging. Points from 3-D physical space may be mapped to 2-dimensional (2-D) image space. Points from 3-D physical space may be mapped to 2-dimensional (2-D) laparoscopic video space using a direct linear transformation (DLT). Points from 3-D physical space may be mapped to 3-D tomographic image space. Points from 3-D physical space may be mapped to 2-dimensional (2-D) endoscopic image space.

In an apparatus according to the present invention, physical space data is collected and processed for use while performing image-guided surgery. The apparatus comprises a probe instrument, an ablative instrument and an image data processor. The probe instrument collects physical space data by probing a plurality of physical surface points of surgically exposed tissue of a living patient. The physical space data provides three-dimensional (3-D) coordinates for each of the physical surface points. The ablative instrument may resect or ablate a particular portion of the exposed tissue.

The image data processor comprises a computer-readable medium holding computer-executable instructions which, based on the physical space data collected by the probe instrument, determine point-based registrations used to indicate surgical position in both image space and physical space. The instructions use the point-based registrations to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the ablative instrument, the tissue, and a particular portion of the tissue to be resected or ablated. The image data is updated on a periodic basis.

The probe instrument may be swept over the surface of the exposed tissue. The apparatus may also comprise a scanning device for scanning tissue of the patient to acquire, store and process a 3-D reference of tissue prior to the tissue being surgically exposed. The image data processor creates a triangularized mesh based on the scanned tissue, determines the volumetric center of a particular portion of the tissue to be resected or ablated during the surgery, and implements an algorithm using the triangularized mesh and the physical space data collected by the probe instrument to determine the point-based registrations. The algorithm may be a Besl and McKay iterative closest point (ICP) registration algorithm.

The scanning device may be one of the following scanners: a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner and a positron emission tomography (PET) scanner.

The ablative instrument may emit a plurality of intermittent infrared signals used to triangulate the position of the ablative instrument in 3-D image space, the signals being emitted from a plurality of infrared emitting diodes (IREDs) distributed over the surface of a handle of the ablative instrument in a spiraling fashion. The IREDs may flash in time sequence. Each IRED may have a 60 degree transmission angle. The image data may be updated in real time at 30 Hz or greater.

The ablative instrument may use one of radio-frequency and cryoablation to resect or ablate the particular portion of the tissue. The ablative instrument may have a tip comprising an ablation device. The ablation zone may extend 1 centimeter from the tip of the ablative instrument.

Points from 3-D physical space may be mapped to 2-dimensional (2-D) image space. Points from 3-D physical space may be mapped to 2-dimensional (2-D) laparoscopic video space using a direct linear transformation (DLT). Points from 3-D physical space may be mapped to 3-D tomographic image space. Points from 3-D physical space may be mapped to 2-dimensional (2-D) endoscopic image space.

The following detailed description of preferred embodiments of the present invention will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present invention, there are shown in the drawings embodiments which are presently preferred. However, the present invention is not limited to the precise arrangements and instrumentalities shown. In the drawings:

FIG. 1 shows a general flow chart in accordance with the present invention.

FIG. 2 shows a detailed flow chart illustrating how collected and processed physical space data is used to perform image-guided surgery in accordance with the present invention.

FIG. 3 shows the hardware system configuration of the present invention.

FIG. 4 shows the basic software architecture of the present invention, including the three divisions of run-time dynamic link libraries.

FIG. 5 shows a flow chart for software used by the present invention.

FIG. 1 shows some of the major events involved in preparing for and performing IIGS. In step 105, it is determined if any extrinsic markers will be attached to the patient. These markers, or fiducials, are designed to be imaged and then localized in both image space and physical space for use in a point-based registration algorithm. Appropriate image volumes for a patient are then acquired, stored and processed (step 110). Most image volumes are acquired as a set of slices, 256×256 or 512×512 pixels per slice at 2 bytes per pixel with 20-200 images per volume. These images are acquired on a computerized tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner or a positron emission tomography (PET) scanner. Images are typically transferred from PACS servers in radiology to image-guided surgical computers, where they are processed for display before surgery if necessary. Transverse, sagittal, or coronal tomographic slices require minor processing before display. In order to visualize surface renderings, a triangulated surface can be created from the volume and displayed. These triangulated surfaces can also be used in registration algorithms to map a physical surface to an image surface.

Once the surgeon has prepared and positioned the patient for surgery, it is necessary to register or map patient or physical space to image space before images can be used interactively (step 115). Physical space data is collected using an instrument whose position is tracked in the operating room. For point-based registrations, corresponding points that can be localized accurately in both image space and physical space are used to create the mapping. These can include extrinsic fiducials that were attached to the patient before imaging or intrinsic fiducials which include anatomic landmarks visible in both physical and image space. For surface-based registrations, a surface of physical space points is collected and mapped onto a triangulated surface. After the accuracy of the registration is assessed, the tracked instrument is moved in physical space and the corresponding position in image space is displayed (steps 120, 125 and 130) and used as a guide during the surgical procedure.

In order to carry out the tasks of determining the location of a tracked probe in space, registering that position into image space, and displaying the appropriate image or images on a computer screen, three divisions of run-time dynamic link libraries (DLLs) were initially developed for our system. The first is a localizer division, which is responsible for determining current surgical position based on the location of an instrument in space. The second is a registration division, which calculates accurate mappings between image space and physical space. The third is a display division, which displays various types of medical images on the computer screen. By separating the system into several DLL divisions, the present invention allows for modularity. Depending on a particular surgical case, for instance, a surgeon may choose one type of localizer, one or more types of displays, and one or more types of registration to indicate surgical position on the images.

Because an image-guided surgical system is, by definition, used in surgery by surgeons and surgical staff, the present invention has an intuitive interface and display to minimize the potential for distraction. Using a simple pushbutton interface, a patient is selected from the database, physical space data is collected, and a registration between physical space and image space is calculated. Images are then selected and displayed in one of four quadrants on the screen and updated as surgical position changes.

A surgeon can use the present invention for guidance in extremely delicate procedures, such as removing a tumor from a sensitive region of the brain or resecting multiple lesions in regions of the liver near major blood vessels. It is important that current surgical position displayed on medical images be as close to actual surgical position as possible. It is also imperative that the images are updated quickly, since a surgeon may change his or her current location rapidly in order to gather spatial information about a certain structure and its surroundings. Accordingly, there are two performance goals in the development of the present invention: accuracy and speed. It is necessary to develop accurate, automatic registration algorithms to map physical space to image space. These algorithms should also be computationally efficient since time constraints are always a concern in surgical applications. It is also important to develop accurately tracked instruments to indicate current surgical position in physical space. Once the registration is calculated and instrument location is mapped into image space, it is necessary to update the images as fast as possible, since surgeons are sensitive to display speed. One of the worst possible scenarios would be for the surgeon to move the probe and make a decision based on the current display, only to find that the display indicated a previous position. A true image-guided surgical system should update images in real time at 30 Hz as surgical position changes.

The present invention collects and processes physical space data for use while performing image-guided surgery, as illustrated in the flow chart of FIG. 2. Prior to surgery, tissue of the patient is scanned to acquire, store and process a 3-D reference (step 205). A triangularized mesh is then created based on the scanned tissue (step 210). The volumetric center of a particular portion of the tissue to be resected or ablated during the surgery is determined, wherein an algorithm using the triangularized mesh and the collected physical space data may be implemented to determine the point-based registrations (step 215). The algorithm may be a Besl and McKay iterative closest point (ICP) registration algorithm.
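For reference, one ICP iteration as it is commonly formulated (a sketch; the patent does not reproduce the algorithm itself): given probed surface points $p_i$ and the triangulated mesh $S$ built from the preoperative scan, iteration $k$ re-pairs each point with its closest mesh point and then re-solves a rigid registration,

```latex
\begin{aligned}
c_k(i) &= \arg\min_{q \in S} \bigl\| (R_k\,p_i + t_k) - q \bigr\|, \\
(R_{k+1},\, t_{k+1}) &= \arg\min_{R,\,t} \sum_{i=1}^{N} \bigl\| (R\,p_i + t) - q_{c_k(i)} \bigr\|^{2},
\end{aligned}
```

with the iteration stopped once the change in the root-mean-square residual falls below a chosen tolerance.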

Tissue of a living patient is then surgically exposed (step 220). Physical space data is then collected by probing a plurality of physical surface points of the exposed tissue, the physical space data providing three-dimensional (3-D) coordinates for each of the physical surface points (step 225). Based on the physical space data collected, point-based registrations used to indicate surgical position in both image space and physical space are determined (step 230). The registrations are used to map into image space, image data describing the physical space of an ablative instrument used to perform the image-guided surgery, an ablation zone of the instrument, the tissue, and a particular portion of the tissue to be resected or ablated (step 235). The image data is updated on a periodic basis (step 240).

FIG. 3 shows the hardware system configuration 300 of the present invention. ORION was developed in Windows NT and is currently running on a 400 MHz processor Micron PC (an image data processor) 305 with 256 MB of memory and a display monitor 310. The display mode is 1280×1024. This computer also contains two specialized cards. The VigraVision-PCI card (VisiCom Inc., Burlington, Vt.) is a combination color frame grabber and accelerated SVGA display controller which is capable of displaying NTSC video images in real time. An ISA high-speed serial port card communicates with the optical localization probe 320 via control box 315. Additional hardware for implementing the present invention includes an optical tracking sensor 325, optical localization probe(s) 320, and an optical reference emitter 330.

Other paradigms for indicating surgical position could be used, including articulated arms. One preferred localization tool for use with the present invention is the Optotrak 3020 (Northern Digital Inc., Waterloo, Ontario). The optical tracking sensor 325 contains three cylindrical lenses which receive light from sequentially strobed infrared light-emitting diodes (IREDs). Triangulation is used to find each IRED relative to the position of the optical tracking sensor 325.

In order for the position and orientation of ablative instrument 320 to be measured by the optical tracking sensor 325, the ablative instrument must have a handle (rigid body) with multiple IREDs distributed over its surface so that at least three IREDs are visible in all of the appropriate orientations of the ablative instrument 320. If three or more IREDs attached to the handle of the ablative instrument 320 are detected by the lenses of the optical tracking sensor 325, the tip and ablation zone of the ablative instrument 320 can be accurately localized in physical space without placing constraints on how the ablative instrument 320 needs to be handled by the surgeon.

The typical ablative instrument 320 used in neurosurgical applications has 24 IREDs which spiral around the instrument's handle. It is appropriate for use as a surgical pointer because it is light, easily directed and is extremely accurate with a tip location error of 0.35 mm in 3-D space. For endoscopic applications, a special rigid body was created for ablative instrument 320 which is more sensitive to roll for more complex manipulations. This 24 IRED "ovoid structure" attached to the endoscope weighs less than 200 g (adding less than 10% to the weight of a typical endoscope). The tip of the endoscope may be tracked with an accuracy of approximately 1.2 mm. An optically tracked radiofrequency probe is placed within the center of tumors, where it is used to microwave or heat-kill lesions. The present invention is able to track the tip of this device with an accuracy of 3.0 mm.

For surgical applications using the present invention, an ablative instrument 320 is used which not only defines a coordinate system in physical space but also preserves the registration if the patient is moved. The optical tracking sensor 325 can localize both the ablative instrument 320 and the reference emitter 330 in sensor unit space. By mapping the position of the ablative instrument 320 into the space defined by the position and orientation of the reference emitter 330, the location of the optical tracking sensor 325 drops out of the equations. The optical tracking sensor 325 can be flexibly placed before surgery and moved during the procedure to accommodate any surgical requirements.
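In standard homogeneous-transform notation (the patent states this relationship only in prose), the cancellation works out as

```latex
T^{\mathrm{ref}}_{\mathrm{instr}} \;=\; \bigl(T^{\mathrm{cam}}_{\mathrm{ref}}\bigr)^{-1}\, T^{\mathrm{cam}}_{\mathrm{instr}},
```

where $T^{\mathrm{cam}}_{\mathrm{instr}}$ and $T^{\mathrm{cam}}_{\mathrm{ref}}$ are the 4×4 poses of the ablative instrument 320 and the reference emitter 330 reported by the optical tracking sensor 325. Only the product of the two poses matters, so the sensor's own frame drops out and the camera can be repositioned during the procedure without invalidating the registration.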

All of the image-guided surgical software in accordance with the present invention was written using Visual C++ 6.0 in Windows NT 4.0. Because the Win32 API offers the greatest versatility in exploiting the features of Windows, this interface was chosen to create and manage the windows created in the system.

The present invention incorporates an executable program of a software system which contains the main entry point to the Windows program. Windows NT passes all user input to programs in the form of messages. Thus, the present invention implements a message pump that receives and dispatches these messages to the appropriate message handler(s). At startup, the present invention is responsible for initializing the image-guided surgery system. This task involves creating and managing the four 512×512 child windows used for image display, saving log information with the time and date the system is run, and loading the dynamic link libraries (DLLs). After initialization, the message pump in the present invention is responsible for receiving and dispatching any messages in the queue. If no messages are present, it sends information concerning the current position of the localized instrument(s) to the child windows.
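A minimal sketch of such an idle-driven message pump is shown below. UpdateSurgicalPosition() is a hypothetical placeholder for the routine that queries the localizer DLL and forwards the current pose to the child windows; it is not the actual ORION code.

```cpp
#include <windows.h>

static void UpdateSurgicalPosition()
{
    // Placeholder: fetch the current 4x4 pose from the localizer DLL, map it
    // through the active registration, and repaint the four child windows.
}

int RunMessagePump()
{
    MSG msg;
    for (;;)
    {
        // Drain any pending keyboard, mouse, or system messages first.
        while (PeekMessage(&msg, NULL, 0, 0, PM_REMOVE))
        {
            if (msg.message == WM_QUIT)
                return (int)msg.wParam;
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        // "Dead time": nothing is queued, so push the current probe position
        // out to the display DLLs before checking the queue again.
        UpdateSurgicalPosition();
    }
}
```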

FIG. 4 shows the basic software architecture of the present invention, including the three divisions of run-time dynamic link libraries. For each of the three divisions, a core set of functions and structures define an interface to the DLLs. The interface represents the functionality that is required to interoperate with the present invention. Any additional functionality present within the DLL is expected to be specific to that individual DLL and not visible to the present invention. There are two functions that are common to all of the DLL interfaces, one which can be called by the present invention to receive a message related to the most recent error that occurred within the library, and another which is called to receive the identification tag for a particular type of DLL. A brief description of DLL division structure is included in the sections below, along with details about the development of particular types within a division.

The localizer division 405 consists of all the DLLs developed to track the position of instruments in physical space. The DLL interface for this division defines a common communication mechanism to various hardware devices which typically rely on vendor-supplied custom APIs for device communication. The task of implementing a localizer DLL is therefore largely a matter of grouping the API functions of the vendor into the required localizer interface functions, and ensuring correct data type conversion and error handling. A single localizer DLL is selected at startup, and all communications are performed through the interface.

Currently, the localizer division 405 contains a DLL written for the Optotrak 3020 system described previously using the Northern Digital software libraries. Several instruments can be tracked in space at once, if necessary. A function in the DLL returns 4×4 matrices that indicate the position and orientation of the optically tracked instrument(s). If either the tracked instrument(s) or the reference rigid body are blocked from the camera view or are unable to be localized with a certain accuracy, an error message is generated and the user is notified. Individual points can be collected with the Optotrak and stored in a file for use in registration algorithms or other analysis, and an entire set of data points can be collected as well. DLLs may be used for other optical tracking devices, such as the Polaris system from Northern Digital, or other non-optical systems, such as articulated arms.
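As an illustration of how such a division DLL might be loaded and queried at startup, the sketch below uses hypothetical exported names (optotrak_localizer.dll, GetIdentityTag, GetPose, GetErrorMessage); the actual interface of the localizer DLLs is not published in the patent.

```cpp
#include <windows.h>
#include <cstdio>

// The two functions described as common to every division DLL, plus a
// division-specific entry point returning a 4x4 pose matrix.
typedef const char* (__cdecl *GetErrorMessageFn)(void);
typedef const char* (__cdecl *GetIdentityTagFn)(void);
typedef int (__cdecl *GetPoseFn)(double pose[4][4]);   // 0 on success

int main()
{
    // Load whichever localizer module was selected at startup.
    HMODULE localizer = LoadLibraryA("optotrak_localizer.dll");
    if (!localizer) { std::printf("localizer DLL not found\n"); return 1; }

    GetIdentityTagFn getTag =
        (GetIdentityTagFn)GetProcAddress(localizer, "GetIdentityTag");
    GetPoseFn getPose = (GetPoseFn)GetProcAddress(localizer, "GetPose");
    GetErrorMessageFn getErr =
        (GetErrorMessageFn)GetProcAddress(localizer, "GetErrorMessage");
    if (!getTag || !getPose || !getErr) { FreeLibrary(localizer); return 1; }

    std::printf("loaded localizer: %s\n", getTag());

    double pose[4][4];
    if (getPose(pose) != 0)   // non-zero: IREDs blocked or accuracy too low
        std::printf("localization error: %s\n", getErr());

    FreeLibrary(localizer);
    return 0;
}
```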

The registration division 410 consists of all DLLs developed to perform registrations between image space and physical space. A generalized user interface (not shown) is used to select physical and image data models to be passed to the appropriate registration DLL. Within the individual DLLs, specific registration algorithms are used to calculate a mapping between the data sets. The interface of the present invention also allows multiple registrations for a given data set, if desired.

Two point-based registration DLLs map 3-D physical space into 3-D image space using rigid body transformations. One DLL finds a closed-form solution based on unit quaternions to create this transformation matrix. The other DLL uses singular value decomposition (SVD) to find the closed-form solution. In order to determine the transformation matrix, both algorithms require the localization of three or more corresponding non-colinear fiducial points in both 3-D spaces. The SVD algorithm is implemented using the Intel Math Kernel Library, which contains a set of robust math kernel functions that are performance optimized for Pentium processors. Another DLL implements a projective registration algorithm based on the direct linear transformation (DLT). If 6 or more non-coplanar fiducials are accurately localized in the two spaces, a projective registration between 3-D physical space and 2-D endoscopic video space can be created. The Intel library is used to calculate the SVD utilized in the DLT registration. A surface-based registration DLL based on the iterative closest-point algorithm of Besl and McKay has also been developed for the present invention.
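For reference, one standard form of the SVD-based closed-form solution (the exact formulation inside the DLLs is not reproduced in the patent): given corresponding fiducials $x_i$ in physical space and $y_i$ in image space,

```latex
\begin{aligned}
\bar{x} &= \tfrac{1}{N}\textstyle\sum_i x_i, \qquad \bar{y} = \tfrac{1}{N}\textstyle\sum_i y_i, \\
H &= \sum_{i=1}^{N} (x_i - \bar{x})(y_i - \bar{y})^{\mathsf{T}} = U\,\Sigma\,V^{\mathsf{T}}, \\
R &= V\,\operatorname{diag}\!\bigl(1,\,1,\,\det(VU^{\mathsf{T}})\bigr)\,U^{\mathsf{T}}, \qquad
t = \bar{y} - R\,\bar{x},
\end{aligned}
```

so that $y_i \approx R\,x_i + t$; the determinant term guards against the solution collapsing to a reflection.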

One performance goal of the present invention is to develop accurate registration algorithms. For the rigid body point-based registrations, two measures of error are defined. The residual error determined in mapping the fiducial points from one 3-D space to the other is called fiducial registration error (FRE). If a mapped point not used in the creation of the transformation matrix is compared to its actual position, the difference is referred to as target registration error (TRE). For 600 neurosurgical applications conducted using externally attached fiducial markers for registration purposes, the mean TRE in mapping from physical space to CT image space was 0.67±0.25 mm, with a worst case of 1.15 mm. The accuracy of the DLT registration algorithm used to project 3-D physical space into endoscopic video space has been investigated, and it has been found that registration accuracy improves as more fiducials are used to calculate the transformation matrix. When 11 fiducials are utilized in the computation of the DLT, average TRE is 0.70 mm. Considerable work has been performed in quantifying the accuracy of many other types of registrations as well. All of the registration DLLs developed will return an appropriate quality of fit measure which helps indicate the accuracy of the registration technique(s) utilized in the surgical procedure. If the registration is point-based, a TRE/FRE metric is used for assessing registration quality. If the registration is surface-based, a new metric has been developed which includes holding out certain subsurfaces from inclusion in the registration process. These surfaces are then used as an independent check of registration quality, along with visual assessment.
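Stated as formulas (the patent defines these error measures only in words), for fiducials $x_i \mapsto y_i$ and a registration $(R, t)$,

```latex
\mathrm{FRE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \bigl\| R\,x_i + t - y_i \bigr\|^{2}},
\qquad
\mathrm{TRE}(p) = \bigl\| R\,p + t - p' \bigr\|,
```

where $p$ is a target point that was not used in computing the transformation and $p'$ is its true position in image space.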

The display division 415 consists of all the DLLs necessary to display any type of medical image. Two display DLLs have been developed, one type for the display of any tomographic image (MR, CT, PET), and another type for the display of any NTSC video image.

For each child window displaying a tomographic view, a device independent bitmap (DIB) is created. Images are displayed through a bit-block transfer of 8 bit data corresponding to a rectangle of pixels from the specified source device context into a destination device context (i.e., one of the four child windows).

Since most medical images are represented using 4096 individual gray levels (12 bits) and standard video hardware can only display 256 different gray levels (8 bits), some form of compression must be implemented. The compression used by the present invention is piecewise linear and is defined by the window and level parameters. The level parameter determines the uncompressed intensity that is mapped to the middle gray level. The window parameter describes the range of uncompressed intensities surrounding the level value that are mapped to shades of gray. Anything below this window is mapped to black (zero intensity) and anything above the window is mapped as white (full intensity). Tomographic images utilized during surgery are loaded into memory and saved in one of two formats depending on the desired method of display. In one format, the 12 bit/pixel grayscale images are temporarily loaded into memory and compressed to 8 bits/pixel based on the default window and level. The 12 bit images are deleted and the 8 bit images are stored in memory. When the position of the tracked instrument is moved, the appropriate 8 bit image is displayed using a bit-block transfer. Images displayed using this method are updated at greater than 30 Hz when all four windows are displaying tomograms. If the window and level of an image are changed, the entire corresponding volume must be reloaded and each image in the volume is compressed to 8 bits using the new values. The time required to compress the images with the updated window and level varies with the size and quantity of the images. Since a surgeon may want to change the window and level of an image set multiple times during a procedure in order to view different structures more clearly, a second alternative display method may be implemented.
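A minimal sketch of this piecewise-linear window/level compression, assuming 12-bit pixel values stored in 16-bit words; the function name and containers are illustrative, not the ORION implementation.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Map 12-bit grayscale (0..4095) to 8-bit (0..255) using window/level.
std::vector<uint8_t> CompressWindowLevel(const std::vector<uint16_t>& src,
                                         int window, int level)
{
    if (window < 1) window = 1;            // guard against a degenerate window
    const int lo = level - window / 2;     // intensities at or below map to black
    const int hi = level + window / 2;     // intensities at or above map to white

    std::vector<uint8_t> dst(src.size());
    for (std::size_t i = 0; i < src.size(); ++i)
    {
        const int v = src[i];
        if (v <= lo)      dst[i] = 0;
        else if (v >= hi) dst[i] = 255;
        else              dst[i] = static_cast<uint8_t>((v - lo) * 255 / window);
    }
    return dst;
}
```

Re-windowing a volume stored this way means rerunning this loop over every slice, which is why the first display method pre-compresses and caches 8-bit images while the "compress on the fly" method described next defers the loop to display time.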

In the alternative display method, the 12 bit volume is loaded just once. Once an appropriate image is selected for viewing, its 12 bit data is compressed to 8 bits based on the given window and level and the resulting DIB is displayed. This "compress on the fly" method allows images to be windowed and leveled without reloading the entire corresponding volume. On our 400 MHz PC, images displayed using this method are updated at 15-20 Hz when all four windows are displaying tomograms. Either method can currently be used for display during surgery. As faster processors are developed, the "compress on the fly" method will update tomograms at an acceptable speed (>30 Hz) to meet performance goals of the present invention and the first method will be eliminated.

Cropped NTSC video images (512×480) are captured with the VigraVision-PCI color frame grabber. This can be used to display an endoscopic or intraoperative ultrasound (IOUS) view. One video image can be updated at 30 Hz in any of the four quadrants. A still video image can be captured and stored in one of several formats, such as a TIFF or a bitmap. When the image is frozen, the position and orientation of the tracked endoscope or IOUS can be saved, and points on the image can be localized and used for projective registration purposes.

Several new display DLLs have been developed for surgical applications. In a graphics DLL, OpenGL, a 3-D graphics library, is used to render object surfaces created from triangulated tomographic volumes. Tumors or other structures segmented from these volumes can also be rendered and projected onto the object surface to aid in localization. For surgical applications, the rendering view is updated "on the fly" as surgical position is changed. In the rotation DLL, renderings and vascular projection images are created pre-operatively and the desired angular projection is displayed. The position of the probe and trajectory information are projected onto the view according to the reconstruction angle.

FIG. 5 shows a software cycle for image-guided surgery in accordance with the present invention. After execution in Windows NT, the system is initialized and all appropriate DLLs are loaded. A patient information file is then selected from a database. This file contains a list of files indicating image sets that are available for the particular surgical case. Each image set has an associated information file which is included in the patient file. This image information file contains a path to the actual image data, a tag to identify the appropriate DLL for its display, various parameters needed to visualize the data, and all registration types and corresponding image data needed for mapping physical space to image space. Once the image sets are loaded and their child window location is determined, it is usually appropriate to collect some type of physical space data using the localizer DLL. This data may be surface points or fiducial points which can be used in the surface-based or point-based registration algorithms described previously. Physical space and image space data is passed to the appropriate registration DLL and a mapping in the form of a matrix is returned to the present invention. Following registration between physical space and image space, the image sets or other display type is then visualized in some or all of the four child windows. The present invention then enters its message loop and maintains a message queue. The program is now ready to receive any keyboard or mouse input from the user. If there is a message in the queue, the present invention responds to the message and checks for other messages. There is "dead time" in the present invention when the program is idle waiting for messages (e.g., keyboard and mouse inputs). If no messages exist in the queue, the present invention receives matrix information from the localizer DLL concerning the current position of the probe(s) in physical space and passes this and any registration matrices to the appropriate display DLLs. The display DLLs map this physical space position into image space using the appropriate registration result and display the updated images, along with a colored circle to indicate current surgical position.

There are several pushbuttons and mouse commands in the present invention that produce messages. For example, if the user presses the right mouse key in a tomogram display window, the window and level of the image volume can be adjusted. The surgeon can also perform another type of registration, collect some type of physical space data with the tracked instrument, or change the display by pressing one of several pushbuttons on the main window. Each of these tasks is kept behind the simple pushbutton so that during surgery the physician can concentrate on the images displayed. As in the first example, the present invention receives a mouse click over a pushbutton as a message and responds to this request before updating the probe position and the images.

Interactive, image-guided surgery is being utilized in more and more applications, including neurosurgery, spinal surgery, and endonasal surgery. The present invention can be used for general surgical applications, including hepatic surgery.

Surgical treatment of hepatic tumors is performed by removal or ablation of involved segments of the liver. These tumors are localized by preoperative imaging studies, primarily CT imaging, intra-operative ultrasound and palpation. Of these localization techniques, only the preoperative tomograms provide high-resolution, 3-D views of the tumor and the surrounding anatomy. However, at present, the tomographic information cannot be actively utilized for surgical guidance in the operating room. Thus, surgeons use other methods for operative tumor localization. It is especially important to accurately localize the tumor during liver ablation procedures, where precise probe placement within the volumetric center of a tumor is critical in the radiofrequency (heat-kill) or cryoablation (freeze-thaw) of the lesion. It is believed that the development of the present invention for hepatic surgery will improve both open and minimally invasive resection and ablation procedures in the liver. Minimally invasive hepatic surgery is currently conducted on a very limited basis. An endoscopic-IIGS system would combine the strengths of real time video imaging and the tomographic guidance from IIGS and make these procedures feasible. Deep-seated tumors which are indicated on CT images will be mapped onto video images of the liver surface using the direct linear transformation (DLT) and then displayed to allow the surgeon more accuracy in performing resections or ablative procedures under endoscopic guidance.
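The DLT referred to here, in its usual eleven-parameter form (the patent does not write it out), maps a 3-D physical point $(x, y, z)$ to 2-D video coordinates $(u, v)$ as

```latex
u = \frac{L_1 x + L_2 y + L_3 z + L_4}{L_9 x + L_{10} y + L_{11} z + 1},
\qquad
v = \frac{L_5 x + L_6 y + L_7 z + L_8}{L_9 x + L_{10} y + L_{11} z + 1}.
```

Each localized fiducial contributes two linear equations in the eleven parameters $L_1, \dots, L_{11}$, which is why at least six non-coplanar fiducials are needed to solve for the transformation (typically by least squares via the SVD, as noted earlier).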

The present invention has been used in the laboratory on phantom livers. These model livers were constructed with rubber silicone, which was poured into a plaster mold along with spherical "tumors" constructed from cork. CT images of the phantom were acquired and a surface was created for registration purposes. The surface of the liver was digitized using the tracked probe and the present invention. The registration of the digitized liver surface to the surface of the liver created from the CT scan was calculated using an implementation of the iterative closest point registration algorithm. Experiments using the phantom liver and the described registration technique produced an average registration error of 1.8 mm. This surface registration calculation and the tracked RF probe have been used to localize the centers of "tumors" within the phantom on the CT images, and it was possible to place the tip of the instrument to within 1.2-6.0 mm of tumor centroids. Errors in this range can be achieved using the surface of the liver intraoperatively.

Experiments have also been conducted to develop a surgical protocol for using the present invention for open hepatic resection in patients with liver tumors, and to track points on the liver to determine respiration-associated hepatic movement. An ablative instrument was placed on three anatomical points on the liver. Approximately 950 localization points (x,y,z) were continuously collected using the present invention. Each patient (n=2) had >2800 localization points collected during continuous respiratory cycles with standard continuous mandatory ventilator cycling. The change in position of the tracked points with respiration was calculated relative to the resting base position of the liver. Average motion with respiration for all anatomical points was 10.3±2.5 mm.

Interactive, image-guided surgery has improved the quality and feasibility of many surgical procedures, including neurosurgery and ENT surgery. It is crucial that these systems provide measures of system performance in a manner which parallels the surgical process. The systems must provide fast and accurate registration processes to map physical space into image space, and must also include instruments which precisely indicate current surgical position. In addition, images must be displayed and updated in real time to allow ease in tracking structures across slices or surface renderings.

A Windows-based image-guided surgical system has been developed on a low cost personal computer with an active optical tracking system. The present invention was developed in Visual C++ with the Win32 API for windows management. Since the present invention was developed using a module library for each of the different components, code migration and maintenance are simplified. At any given time, a subtask contained within a particular DLL used in the system can be "checked out" of a Visual C++ SourceSafe database that stores the most recently edited master copy.

The present invention is capable of applications beyond neurosurgery, including open and minimally invasive hepatic procedures. Target errors on the order of 2 mm for the phantom registration studies were higher than those seen in clinical neurosurgical cases. This number is acceptable since the liver is much more homogeneous than the brain and higher accuracy is not as critical an issue.

The present invention may be implemented with any combination of hardware and software. If implemented as a computer-implemented apparatus, the present invention is implemented using means for performing all of the steps and functions described above.

The present invention can be included in an article of manufacture (e.g., one or more computer program products) having, for instance, computer useable media. The media has embodied therein, for instance, computer readable program code means for providing and facilitating the mechanisms of the present invention. The article of manufacture can be included as part of a computer system or sold separately.

It will be appreciated by those skilled in the art that changes could be made to the embodiments described above without departing from the broad inventive concept thereof. It is understood, therefore, that this invention is not limited to the particular embodiments disclosed, but it is intended to cover modifications within the spirit and scope of the present invention as defined by the appended claims.

Galloway, Jr., Robert L., Chapman, William C., Stefansic, James D., Herline, Alan J., Pinson, Candice D.

Assignments
Jun 18, 2001: Galloway, Jr., Robert L. to Vanderbilt University (assignment of assignors interest; reel/frame 011963/0824)
Jun 19, 2001: Chapman, William C. to Vanderbilt University (assignment of assignors interest; reel/frame 011963/0824)
Jun 19, 2001: Stefansic, James D. to Vanderbilt University (assignment of assignors interest; reel/frame 011963/0824)
Jun 19, 2001: Herline, Alan J. to Vanderbilt University (assignment of assignors interest; reel/frame 011963/0824)
Jun 25, 2001: Pinson, Candice D. to Vanderbilt University (assignment of assignors interest; reel/frame 011963/0824)
Jun 27, 2001: Vanderbilt University (assignment on the face of the patent)
Jun 14, 2002: Vanderbilt University to the National Institutes of Health (NIH), U.S. Department of Health and Human Services (DHHS), U.S. Government (confirmatory license under Executive Order 9424; reel/frame 021290/0702)