An interactive system for a local intervention inside a region of a non-homogeneous structure, such as the skull of a patient, which is related to the frame of reference (R2) of an operating table, and which is connected to a reference structure comprising a plurality of base points. The system creates on a screen a representation of the non-homogeneous structure and of the reference structure connected thereto, provides the coordinates of the images of the base points in a first frame of reference (R1), allows the marking of the coordinates of the base points in R2, and allows the carrying out of the local intervention with an active member such as a trephining tool, a needle, or a radioactive or chemical implant. The system also optimizes the transfer of reference frames between R1 and R2, from the coordinates of the base points in R2 and of their images in R1, by reducing to a minimum the deviations between the coordinates of the images in R1 and the coordinates of the base points expressed in R1 after transfer. The system also establishes a real-time bidirectional coupling between: (1) an origin and a direction of intervention simulated on the screen, and (2) the position of the active member.

Patent
   RE43952
Priority
Oct 05 1989
Filed
Oct 05 1990
Issued
Jan 29 2013
Expiry
Jan 29 2030
Assg.orig
Entity
unknown
5
603
EXPIRED
0. 67. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient; and
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
wherein the device is operable to construct three-dimensional images from captured two-dimensional images;
wherein the controller is operable to superimpose two-dimensional image data on the three-dimensional images wherein any change in soft external parts of the patient can be visualized as compared with the image captured by the imaging device.
0. 68. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame; and
an active member operable to perform the intervention;
wherein the device includes a display operable to display the image data of the region of the patient in relation to the image reference frame;
wherein the controller is further operable to determine residual uncertainty which is used to represent a contour with dimensions larger than those which would normally be represented and the display is operable to display the residual uncertainty of the contour.
0. 84. A method for performing an image guided intervention inside a region of a patient, said method comprising:
accessing a first image data of the region of the patient captured with an imaging system where the first image data includes image data of a first reference structure;
identifying the first reference structure in the first image data to establish an image reference frame;
identifying a second reference structure relative to the patient to establish a patient reference frame;
correlating the position of the first reference structure in the image reference frame in the first image data with the position of the second reference structure in the patient reference frame; and
tracking an active member at least to determine a position of the active member in the patient reference frame to determine a location of the active member based on the tracking of the active member and transmitting the determined position in the patient reference frame for display on a display device relative to the image reference frame of the first image data based at least on the correlation of the first reference structure and the second reference structure.
0. 17. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
an active member operable to perform the intervention; and
a tracking system operable to determine a position of at least the second reference structure and a position of the active member and configured to transmit the determined positions of the second reference structure and the active member to the controller;
wherein the controller is configured to determine the position of the active member based on the determined position of at least the active member and the correlation of the first reference structure and the second reference structure.
0. 70. An interactive system for intervention inside a region of a patient, said interactive system comprising:
a device operable to receive image data of the region of the patient, wherein the image data includes image data of a first reference structure to establish an image reference frame for the region of the patient;
a second reference structure positioned relative to the patient to establish a patient reference frame for the region of the patient;
a controller operable to correlate the position of the first reference structure in the image reference frame with the position of the second reference structure in the patient reference frame;
an active member operable to perform the intervention inside the region of the patient;
a tracking system operable to track the position of the active member in relation to the patient reference frame, the tracking system being in communication with the controller to transmit the tracked position of the active member as position information to the controller, wherein the controller is operable to determine the position of the active member relative to the image reference frame; and
a display operable to display the real-time position of the active member in the image reference frame based on the controller determined position of the active member based on the tracked position of the active member from the tracking system, wherein the controller is configured to generate a representation of the active member that is displayed on the display relative to a display of the received image data.
1. An interactive system for local intervention inside a region of a non-homogeneous structure to which is connected a reference structure containing a plurality of base points, the interactive system comprising:
means for dynamically displaying a three-dimensional image of a representation of the non-homogeneous structure and of the reference structure connected to the non-homogeneous structure, wherein the three-dimensional image also includes a plurality of images of the plurality of base points;
means for determining a set of coordinates of the plurality of images of the plurality of base points in a first reference frame;
means for fixing a position of the non-homogeneous structure and of the reference structure with respect to a second reference frame;
means for determining a set of coordinates of the plurality of base points in the second reference frame;
means of intervention comprising an active member whose position is determined with respect to the second reference frame;
means for generating a plurality of reference frame translation tools for translating a plurality of reference frames from the first reference frame to the second reference frame and vice versa, based on the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and of the set of coordinates of the plurality of base points in the second reference frame, in such a way as to reduce to a minimum at least one of a set of deviations between the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and the set of coordinates of the base points, expressed in the first reference frame using the plurality of reference frame translation tools;
means for defining, with respect to the first reference frame, a simulated origin of intervention and a simulated direction of intervention; and,
means for transferring the plurality of reference frames using the plurality of reference frame translation tools to establish a bidirectional coupling between the simulated origin of intervention and the simulated direction of intervention and the position of the active member.
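In modern terms, the frame-transfer optimization recited in claim 1 — reducing to a minimum the deviations between the base points' image coordinates in the first reference frame and their measured coordinates in the second — is a point-based rigid registration. The sketch below uses the Kabsch (SVD) least-squares method, which the patent does not name; the function names and the NumPy dependency are illustrative assumptions, not the patent's implementation.

```python
import numpy as np

def fit_rigid_transform(p_r1, p_r2):
    """Least-squares rigid transform (R, t) carrying R1 image points onto
    R2 base points, minimizing the residual deviations (Kabsch method)."""
    p_r1 = np.asarray(p_r1, float)
    p_r2 = np.asarray(p_r2, float)
    c1, c2 = p_r1.mean(axis=0), p_r2.mean(axis=0)
    H = (p_r1 - c1).T @ (p_r2 - c2)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Force a proper rotation (det = +1), guarding against reflections.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c2 - R @ c1
    return R, t

def residual_deviation(R, t, p_r1, p_r2):
    """RMS deviation between transferred R1 points and measured R2 points."""
    mapped = (R @ np.asarray(p_r1, float).T).T + t
    return float(np.sqrt(((mapped - np.asarray(p_r2, float)) ** 2)
                         .sum(axis=1).mean()))
```

With exact correspondences the residual is zero; with measurement noise it quantifies the kind of residual uncertainty that claim 6 proposes to take into account when displaying contours.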
2. The interactive system according to claim 1, wherein the plurality of reference frame translation tools comprise:
means for creating a matrix (M) for transferring between the first reference frame and a first intermediate reference frame based on a set of coordinates of a set of three images of a set of three base points of the reference structure;
means for creating a matrix (N) for transferring between the second reference frame and a second intermediate reference frame based on the set of coordinates of the set of three images of the set of three base points of the reference structure; and,
means for validating matrix (M) and matrix (N) based on the set of three base points and the set of three images, such that at least one deviation between an expression for at least one additional base point in the second intermediate reference frame and an expression for at least one image point of the additional base point in the first intermediate reference frame is reduced to a minimum.
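One way to realize the intermediate frames of claim 2 is to build, from three non-collinear base points, an orthonormal frame anchored at the first point; the matrices (M) and (N) then follow from the same three points as observed in each reference frame. This construction is a hypothetical illustration, not taken from the patent text.

```python
import numpy as np

def frame_from_three_points(p0, p1, p2):
    """Return a 4x4 homogeneous matrix whose rotation columns form an
    orthonormal basis built from three non-collinear points, anchored at
    p0 (one possible intermediate frame in the sense of claim 2)."""
    p0, p1, p2 = (np.asarray(p, float) for p in (p0, p1, p2))
    x = p1 - p0
    x /= np.linalg.norm(x)                 # first axis along p0->p1
    z = np.cross(x, p2 - p0)
    z /= np.linalg.norm(z)                 # normal to the three-point plane
    y = np.cross(z, x)                     # completes a right-handed basis
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, p0
    return T
```

Validating (M) and (N) as in claim 2 then amounts to expressing a fourth base point in both intermediate frames and checking that the deviation between the two expressions is minimal.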
3. The interactive system according to claim 2, wherein the means for transferring the plurality of reference frames using the plurality of reference frame translation tools further comprises:
a first transfer sub-module for transferring a set of representation/non-homogeneous structure coordinates, and
a second transfer sub-module for transferring a set of non-homogeneous structure/representation coordinates.
4. The interactive system according to claim 3, wherein the first transfer sub-module comprises:
means for acquiring a set of coordinates (XM, YM, ZM), expressed in the first reference frame, of a point of the representation of the non-homogeneous structure to be transferred, by selection on the representation;
means for calculating a set of corresponding coordinates (XP, YP, ZP), expressed in the second reference frame, on the non-homogeneous structure through a transformation:
{XP, YP, ZP}=M*N−1 *{XM, YM, ZM} where M*N−1 represents a product of the matrix (M) and an inverse of the matrix (N), and
means for processing, with the aid of the corresponding coordinates (XP, YP, ZP), to display a corresponding point on a surface of the non-homogeneous structure and to secure the intervention.
5. The interactive system according to claim 3, wherein the second transfer sub-module comprises:
means for acquiring a set of coordinates (XP, YP, ZP), expressed in the second reference frame, of a point of the non-homogeneous structure to be transferred;
means for calculating a set of corresponding coordinates (XM, YM, ZM), expressed in the first reference frame, of the representation through a transformation:
{XM, YM, ZM}=N*M−1 *{XP, YP, ZP} where N*M−1 represents the product of the matrix (N) and an inverse of the matrix (M); and,
means for displaying the representation using the set of corresponding coordinates (XM, YM, ZM).
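Claims 4 and 5 express the two transfer sub-modules as matrix products. Assuming (M) and (N) are 4×4 homogeneous transforms, the transfers can be sketched as below; because the two products are mutual inverses, a round trip returns the original point. The function names are illustrative assumptions.

```python
import numpy as np

def transfer_r1_to_r2(M, N, point_r1):
    """Claim-4 style transfer: apply M * inv(N) to a point given in the
    first (image) reference frame, in homogeneous coordinates."""
    p = np.append(np.asarray(point_r1, float), 1.0)
    return (M @ np.linalg.inv(N) @ p)[:3]

def transfer_r2_to_r1(M, N, point_r2):
    """Claim-5 style inverse transfer: apply N * inv(M) to a point given
    in the second (patient) reference frame."""
    p = np.append(np.asarray(point_r2, float), 1.0)
    return (N @ np.linalg.inv(M) @ p)[:3]
```

The round-trip identity (N*M−1)(M*N−1) = I is what makes the coupling of claim 1 bidirectional: a point picked on the representation and a point touched on the structure map onto each other consistently.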
6. The interactive system according to claim 1, wherein the means for generating the plurality of reference frame translation tools also generate, in association with the reference frame translation tools, tools for taking into account a residual uncertainty which is based on the set of deviations between the set of coordinates of the plurality of images of the plurality of base points in the first reference frame and the set of coordinates of the base points, the tools for taking into account the residual uncertainty usable for displaying a set of contours in the representation whilst taking into account the residual uncertainties.
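The residual-uncertainty display of claim 6 (and of claim 31) can be illustrated by enlarging a displayed contour by the registration residual, so the drawn outline bounds the true structure. The radial dilation below is a deliberately simplistic stand-in, assumed for illustration, not the patent's method.

```python
import numpy as np

def inflate_contour(contour_xy, margin):
    """Push each vertex of a closed 2-D contour outward from the centroid
    by `margin` (e.g. the RMS registration residual), so the displayed
    contour has dimensions larger than those normally represented."""
    c = np.asarray(contour_xy, float)
    centroid = c.mean(axis=0)
    v = c - centroid
    norms = np.linalg.norm(v, axis=1, keepdims=True)
    return centroid + v * (1.0 + margin / norms)
```

Displaying the inflated contour around a planned trajectory is one way to reduce the chance of traversing undesired structures, as claim 69 suggests.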
7. The interactive system according to claim 1, wherein the means for dynamically displaying the three-dimensional image comprises:
a file containing digitized data from a set of two-dimensional images constituted by successive non-invasive tomographic sections of the non-homogeneous structure;
means for calculating and reconstructing the three-dimensional image from the set of two-dimensional images; and
a high-resolution display screen.
8. The interactive system according to claim 7, wherein the means for calculating and reconstructing the three-dimensional image from the set of two-dimensional images comprises a program consisting of computer-aided design type software.
9. The interactive system according to claim 1, wherein the means for determining the set of coordinates of the plurality of base points in the second reference frame comprises a three-dimensional probe equipped with a tactile tip for delivering a set of coordinates of the tactile tip in the said second reference frame.
10. The interactive system according to claim 1, wherein the means for determining the set of coordinates of the plurality of base points in the second reference frame comprises at least one of a set of optical sensors and a set of electromagnetic sensors.
11. The interactive system according to claim 1, wherein a portion of the set of the plurality of base points of the reference structure comprises a plurality of marks positioned on a lateral surface of the non-homogeneous structure.
12. The interactive system according to claim 11, wherein the plurality of marks are four in number and are distributed over the lateral surface so as to define a substantially symmetrical tetrahedron.
13. The interactive system according to claim 1, wherein the means of intervention comprises:
a guide arm to secure intervention in the region of the non-homogeneous structure, the guide arm having a position marked with respect to the second reference frame; and,
an active intervention member whose position is marked with respect to the second reference frame.
14. The interactive system according to claim 13, wherein the active intervention member is removable and selected from the group consisting of:
tools for trephining;
needles and implants;
laser and radioisotope emission heads; and, sighting and viewing systems.
15. The interactive system according to claim 1, wherein the means for transferring the plurality of reference frames establishes a coupling between a direction of visualization of the representation of the non-homogeneous structure on the display means and a direction of observation of the non-homogeneous structure and of the reference structure by the active intervention member.
16. The interactive system according to claim 15, further comprising:
a first module for visualizing a representation in a direction given by two points;
a second module for visualizing a representation in a direction given by an angle of elevation and an angle of azimuth.
0. 18. The interactive system as defined in claim 17 wherein the first reference structure includes a plurality of base points.
0. 19. The interactive system as defined in claim 18 wherein the second reference structure includes a plurality of tracking markers.
0. 20. The interactive system as defined in claim 19 wherein the plurality of base points are generated from the plurality of tracking markers.
0. 21. The interactive system as defined in claim 18 wherein the plurality of base points are at least one of a plurality of notable points on the patient and marks fixed to the patient.
0. 22. The interactive system as defined in claim 21 wherein the notable points are selected from a group comprising a head, eyebrows, temples, frontal medial point, an apex of a skull, a center of gravity of the orbits of the eyes, and a combination thereof.
0. 23. The interactive system as defined in claim 18 wherein the controller further includes a graphical tool operable to identify the plurality of base points of the first reference structure in the image data of the image data reference frame.
0. 24. The interactive system as defined in claim 23 wherein the graphical tool is a mouse in communication with the controller.
0. 25. The interactive system as defined in claim 17 wherein the second reference structure includes a plurality of tracking markers.
0. 26. The interactive system as defined in claim 25 wherein the plurality of tracking markers are attached to the patient.
0. 27. The interactive system as defined in claim 17 wherein the second reference structure is attached to the patient.
0. 28. The interactive system as defined in claim 17 wherein the first reference structure is attached to the patient.
0. 29. The interactive system as defined in claim 17 wherein the tracking system includes a marker device operable to determine a position of the second reference structure in relation to the patient reference frame.
0. 30. The interactive system as defined in claim 29 wherein the marker device includes a telemetry system operable to determine the position of the second reference structure in the patient reference frame and transmit the determined position to the controller, wherein the controller is operable to perform the correlation at least with the transmitted determined position.
0. 31. The interactive system as defined in claim 30 wherein the telemetry system is an electromagnetic telemetry system.
0. 32. The interactive system as defined in claim 31 wherein the second reference structure includes electromagnetic tracking markers, wherein the electromagnetic telemetry system is operable to determine the position of the electromagnetic tracking markers of the second reference structure in relation to the patient reference frame.
0. 33. The interactive system as defined in claim 32, wherein the electromagnetic tracking markers are transmitters and the electromagnetic telemetry system is an electromagnetic sensor.
0. 34. The interactive system as defined in claim 30 wherein the telemetry system is an optical telemetry system.
0. 35. The interactive system as defined in claim 34 wherein the optical telemetry system includes at least one of a video camera or an infrared camera to image at least the second reference structure and configured to plot points of the second reference structure.
0. 36. The interactive system as defined in claim 34 wherein the second reference structure includes optical tracking markers, wherein the optical telemetry system is operable to determine the position of the optical tracking markers of the second reference structure in relation to the patient reference frame.
0. 37. The interactive system as defined in claim 34 wherein the optical telemetry system utilizes position and shape recognition to identify the second reference structure.
0. 38. The interactive system as defined in claim 29 wherein the marker device includes a three-dimensional probe.
0. 39. The interactive system as defined in claim 38 wherein the three-dimensional probe includes a tactile tip operable to engage the second reference structure.
0. 40. The interactive system as defined in claim 38 wherein the three-dimensional probe is robotically manipulated, such that the instantaneous position of the three-dimensional probe is known.
0. 41. The interactive system as defined in claim 29 wherein the marker device includes a set of cameras operable to determine the position of the second reference structure in relation to the patient reference frame.
0. 42. The interactive system as defined in claim 41 wherein the set of cameras are selected from video and infrared cameras.
0. 43. The interactive system as defined in claim 29 wherein the marker device is a laser beam emission system operable to illuminate the second reference structure to determine a position of the second reference structure in relation to the patient reference frame.
0. 44. The interactive system as defined in claim 17 wherein the first reference structure is generated from the second reference structure.
0. 45. The interactive system as defined in claim 17 wherein the active member is selected from a group comprising a trephining tool, a needle, a laser, a radioisotope emission head, an endoscopic viewing system, a tool used in the intervention, an implant, a sighting system, a microscope, and combinations thereof.
0. 46. The interactive system as defined in claim 17 further comprising a telemetry system operable to determine the position of the active member in the patient reference frame, said telemetry system in communication with the controller.
0. 47. The interactive system as defined in claim 46 wherein the position information of the active member is six degree of freedom information in relation to the patient reference frame.
0. 48. The interactive system as defined in claim 17 wherein the device includes a display operable to display the image data of the region of the patient in relation to the image reference frame.
0. 49. The interactive system as defined in claim 48 wherein the controller is further operable to determine a reference origin of intervention and a direction of intervention and said display is further operable to display the reference origin of intervention and direction of intervention.
0. 50. The interactive system as defined in claim 48 wherein the controller is further operable to model a reference origin of intervention and a direction of intervention and said display is further operable to display the modeled reference origin of intervention and direction of intervention.
0. 51. The interactive system as defined in claim 48 wherein the display is further operable to display the real-time position of the active member in the image reference frame based on the determined position of the active member with the tracking system.
0. 52. The interactive system as defined in claim 48 wherein the display is further operable to display image data relative to a direction of intervention of the active member.
0. 53. The interactive system as defined in claim 52 wherein the image data is displayed perpendicular to a direction of intervention of the active member.
0. 54. The interactive system as defined in claim 48 wherein the controller is further operable to simulate an optimal trajectory of advance of the active member and said display is operable to display the optimal trajectory in the image data relative to the image reference frame.
0. 55. The interactive system as defined in claim 54 wherein movement of the active member is steered to the optimal trajectory to carry out a programmed intervention.
0. 56. The interactive system as defined in claim 17 wherein the active member is robotically controlled.
0. 57. The interactive system as defined in claim 17 wherein the image data is at least one of a magnetic resonance image data, a tomographic image data, a radiographic image data, x-ray image data, and combinations thereof.
0. 58. The interactive system as defined in claim 57 wherein the head set is further fixed to an operating table.
0. 59. The interactive system as defined in claim 17 wherein the device is operable to construct three-dimensional images from captured two-dimensional images.
0. 60. The interactive system as defined in claim 17 wherein the controller is further operable to correlate map data in a map reference frame with the patient reference frame.
0. 61. The interactive system as defined in claim 17 wherein the intervention is at least one of a neurosurgery, orthopedic surgery, cranial surgery, and combinations thereof.
0. 62. The interactive system as defined in claim 17 wherein the second reference structure is fixed to a head set.
0. 63. The interactive system as defined in claim 17 wherein the device further includes memory operable to store the image data.
0. 64. The interactive system as defined in claim 17 wherein the device is a first computer.
0. 65. The interactive system as defined in claim 64 wherein the controller is a second computer.
0. 66. The interactive system as defined in claim 65 wherein the first computer and the second computer is a single work station.
0. 69. The interactive system as defined in claim 68 wherein the contour is a display of an active member and a representation of residual uncertainty in order to reduce the chance of traversing undesired structures.
0. 71. The interactive system as defined in claim 70 wherein the active member is selected from a group comprising a trephining tool, a needle, a laser, a radioisotope emission head, an endoscopic viewing system, a tool used in the intervention, an implant, a sighting system, a microscope, and combinations thereof.
0. 72. The interactive system as defined in claim 70 wherein the position information of the active member is six degree of freedom information in relation to the patient reference frame.
0. 73. The interactive system as defined in claim 70 wherein the tracking system that tracks the position of the active member is a telemetry system in communication with the controller.
0. 74. The interactive system as defined in claim 70 wherein the active member is robotically controlled.
0. 75. The interactive system as defined in claim 70 wherein the image data is at least one of a magnetic resonance image data, a tomographic image data, a radiographic image data, x-ray image data, and combinations thereof.
0. 76. The interactive system as defined in claim 70 wherein the controller is further operable to determine a reference origin of intervention and a direction of intervention and said display is further operable to display the reference origin of intervention and direction of intervention.
0. 77. The interactive system as defined in claim 70 wherein the first reference structure includes a plurality of base points.
0. 78. The interactive system as defined in claim 77 wherein the second reference structure includes a plurality of tracking markers.
0. 79. The interactive system as defined in claim 78 wherein the plurality of base points are generated by the plurality of tracking markers.
0. 80. The interactive system as defined in claim 70 wherein the second reference structure is attached to the patient.
0. 81. The interactive system as defined in claim 70 wherein intervention is at least one of a neurosurgery, orthopedic surgery, cranial surgery intervention, and combinations thereof.
0. 82. The interactive system as defined in claim 70 wherein the second reference structure is fixed to a head set.
0. 83. The interactive system as defined in claim 70 wherein the display forms part of the device and wherein the image data received is acquired image data of the region of the patient and is displayed on the display, further wherein the representation of the active member is displayed on the acquired image data of the region of the patient.
0. 85. The method as defined in claim 84 further comprising attaching a plurality of tracking markers to the patient where the tracking markers form the second reference structure.
0. 86. The method as defined in claim 85 further comprising identifying the position of the tracking markers in the patient reference frame using a telemetry system.
0. 87. The method as defined in claim 86 further comprising transmitting from the tracking markers a signal and receiving the transmitted signal with an electromagnetic sensor to identify the position of the second reference structure in the patient reference frame.
0. 88. The method as defined in claim 84 wherein identifying the first reference structure includes identifying a plurality of base points visible in the image data.
0. 89. The method as defined in claim 88 wherein identifying the plurality of base points includes identifying at least one of notable points on the patient and marks fixed to the patient representing the plurality of base points.
0. 90. The method as defined in claim 89 wherein the notable points are selected from a group comprising a head, eyebrows, temporal point, frontal medial point, an apex of a skull, a center of gravity of the orbits of the eyes, and a combination thereof.
0. 91. The method as defined in claim 88 wherein the plurality of base points visible in the image data are generated from the plurality of tracking markers attached to the patient.
0. 92. The method as defined in claim 84 further comprising attaching the second reference structure to the patient.
0. 93. The method as defined in claim 92 further comprising attaching the second reference structure to a head set.
0. 94. The method as defined in claim 84 further comprising displaying the image data of the region of the patient, including displaying the first reference structure.
0. 95. The method as defined in claim 94 further comprising:
displaying the position of the active member as a representation of the active member in the accessed first image data that is captured image data that is correlated to the patient based on the correlation and displayed on a display device with the position of the active member being correlated between the patient reference frame defined by the first reference structure fixed to the patient and the image reference frame based on the tracking of the active member.
0. 96. The method as defined in claim 95 further comprising identifying the position of the active member with a telemetry system by transmitting the tracked location of the active member for displaying the representation of the active member.
0. 97. The method as defined in claim 95 further comprising displaying a reference origin of intervention and a direction of intervention in the image data.
0. 98. The method as defined in claim 97 further comprising tracking the position of the active member relative to the reference origin of intervention and the direction of intervention.
0. 99. The method as defined in claim 84 further comprising performing an intervention on the patient with an active member.
0. 100. The method as defined in claim 99 wherein the intervention is selected from at least one of a neurosurgery, orthopedic surgery, cranial surgery, and combinations thereof.
0. 101. The method as defined in claim 84 further comprising converting two-dimensional image data to three-dimensional image data.

The invention relates to an interactive system for local intervention inside a region of a nonhomogeneous structure.

The performing of local interventions inside a nonhomogeneous structure, such as intracranial surgical operations or orthopedic surgery, currently poses the problem of optimizing the intervention path or paths so as to secure, on the one hand, total intervention over the region or structure of interest, such as a tumor to be treated or explored, and, on the other hand, minimal lesion to the regions neighboring or adjoining the region of interest. This entails localizing and then selecting the regions of the nonhomogeneous structure which are least sensitive to being traversed, or the least susceptible to damage as regards the integrity of the structure.

Numerous works aimed at providing a solution to the abovementioned problem have hitherto been the subject of publications. Among the latter may be cited the article entitled “Three Dimensional Digitizer (Neuronavigator): New Equipment for computed Tomography Guided Stereotaxic Surgery”, published by Eiju Watanabe, M.D., Takashi Watanabe, M.D., Shinya Manaka, M.D., Yoshiaki Mayanagi, M.D., and Kintomo Takakura, M.D. Department of Neurosurgery, Faculty of Medicine, University of Tokyo, Japan, in the journal Surgery Neurol. 1987: 27 pp. 543-547, by Elsevier Science Publishing Co., Inc. The Patent WO-A-88 09151 teaches a similar item of equipment.

In the abovementioned publications are described in particular a system and an operational mode on the basis of which a three-dimensional position marking system, of the probe type, makes it possible to mark the three-dimensional position coordinates of a nonhomogeneous structure, such as the head of a patient having to undergo a neurosurgical intervention, and then to put into correspondence, as a function of the relative position of the nonhomogeneous structure, a series of corresponding images consisting of two-dimensional images sectioned along an arbitrary direction, and obtained previously with the aid of a medical imaging method of the “scanner” type.

The system and the operational mode mentioned above offer a sure advantage for the intervening surgeon since the latter has available, during the intervention, apart from a direct view of the intervention, at least one two-dimensional sectional view enabling him to be aware, in the sectional plane, of the state of performance of the intervention.

However, and by virtue of the very design of the system and of the operational mode mentioned above, the latter allow neither a precise representation of the state of performance of the intervention, nor partially or totally automated conduct of the intervention in accordance with a program for advance of the instrument determined prior to the intervention.

Such a system and such an operational mode cannot therefore claim to eradicate all man-made risk, since the intervention is still conducted by the surgeon alone.

The objective of the present invention is to remedy the whole of the problem cited earlier, and in particular to propose a system permitting as exact as possible a correlation, at any instant, between an intervention modeling on the screen and the actual intervention, and furthermore the representation from one or more viewing angles, and if appropriate in one or more sectional planes, of the nonhomogeneous structure, the sectional plane or planes possibly being for example perpendicular to the direction of the path of advance of the instrument or of the intervention tool.

Another objective of the present invention is also the implementation of a system permitting simulation of an optimal trajectory of advance of the tool, so as to constitute an assisted or fully programed intervention.

Finally, an objective of the present invention is to propose a system making it possible, on the basis of the simulated trajectory and of the programed intervention, to steer the movement of the instrument or tool to the said trajectory so as to carry out the programed intervention.

The invention proposes to this effect an interactive system for local intervention inside a region of a nonhomogeneous structure to which is tied a reference structure containing a plurality of base points, characterized in that it comprises:

A more detailed description of the system of the invention will be given below with reference to the drawings in which:

FIG. 1 represents a general view of an interactive system for local intervention inside a region of a nonhomogeneous structure according to the present invention,

FIG. 2 represents, in the case where the nonhomogeneous structure consists of the head of a patient, and with a view to a neurosurgical intervention, a reference structure tied to the nonhomogeneous structure and enabling a correlation to be established between a “patient” reference frame and a reference frame of images of the patient which were made and stored previously,

FIG. 3 represents an advantageous embodiment of the spatial distribution of the reference structure of FIG. 2,

FIG. 4 represents an advantageous embodiment of the intervention means set up on an operating table in the case of a neurosurgical intervention,

FIGS. 5a and 5b represent a general flow diagram of functional steps implemented by the system,

FIGS. 6 through 8 represent flow diagrams of programs permitting implementation of certain functional steps of FIG. 5b,

FIG. 9a represents a flow diagram of a program permitting implementation of a functional step of FIG. 5a,

FIG. 9b represents a flow diagram of a program permitting implementation of another functional step of FIG. 5a,

FIGS. 10a and 10b represent a general flow diagram of the successive steps of an interactive dialogue between the system of the present invention and the intervening surgeon and

FIG. 10c represents a general flow diagram of the successive functional steps carried out by the system of the invention, having (sic) the intervention, prior to the intervention, during the intervention and after the intervention.

The interactive system for local intervention according to the invention will firstly be described in connection with FIG. 1.

A nonhomogeneous structure, denoted SNH, on which an intervention is to be performed, consists for example of the head of a patient in which a neurosurgical intervention is to be performed. It is however understood that the system of the invention can be used to carry out any type of intervention in any type of nonhomogeneous structure inside which structural and/or functional elements or units may be in evidence and whose integrity, during the intervention, is to be respected as far as possible.

The system comprises means, denoted 1, of dynamic display by three-dimensional imaging, with respect to a first reference frame R1, of a representation (denoted RSR) of a reference structure SR (described later) tied to the structure SNH, and a representation or modeling of the nonhomogeneous structure, denoted RSNH.

More precisely, the means 1 make it possible to display a plurality of successive three-dimensional images, from different angles, of the representations RSNH and RSR.

The system of the invention also comprises means, denoted 2, of tied positioning, with respect to a second reference frame R2, of the structures SNH and SR.

In the present non-limiting example, the head of the patient, bearing the reference structure SR, is fixed on an operating table TO to which are fixed the means 2 of tied positioning.

Of course, the patient whose head has been placed in the means 2 for tied positioning has previously been subjected to the customary preparations, in order to enable him to undergo the intervention.

The means 2 of tied positioning with respect to R2 will not be described in detail since they can consist of any means (such as a retaining headset) normally used in the field of surgery or neurosurgery. The reference frame R2 can arbitrarily be defined as a tri-rectangular reference trihedron tied to the operating table TO, as represented in FIG. 1.

Means 3 of marking, with respect to the second reference frame R2, the coordinates, denoted X2, Y2, Z2, of arbitrary points, and in particular of a certain number of base points of the reference structure SR are furthermore provided.

These base points constituting the reference structure SR can consist of certain notable points and/or of marks fixed to the patient, at positions selected by the surgeon and in particular at these notable points.

The system of the invention further comprises computing means 4 receiving from the means 3 of marking the coordinates X2, Y2, Z2.

The computing means 4, as will be seen in detail later, are designed to elaborate optimal tools for reference frame transfer using on the one hand the coordinates in R2, measured by the probe 3, of a plurality of base points of the structure SR, and on the other hand the coordinates in R1, determined by graphical tools of the computer MO1 (pointing by mouse, etc.), of the images of the corresponding base points in the representation RSR, so as to secure the best possible correlation between the information modeled in the computer equipment and the corresponding real-world information.

There is furthermore provision for reference frame transfer means 11 designed to use the tools thus elaborated and to secure this correlation in real time.

Moreover, means 40 are provided, as will be seen in detail later, for determining or modeling a reference origin of intervention ORI and a direction of intervention Δ.

With the aid of the means 11, the modeled direction of intervention Δ can, at least prior to the intervention and at the start of the intervention, be materialized through an optical sighting system available to the surgeon, it being possible to steer this sighting system positionally with respect to the second reference frame R2.

The sighting system will be described later.

The system of the present invention finally comprises means 5 of intervention comprising an active member, denoted 50, whose position is specified with respect to the second reference frame R2. The active member can consist of the various tools used in surgical intervention. For example, in the case of an intracranial neurosurgical intervention, the active member could be a trephining tool, a needle, a laser or radioisotope emission head, or an endoscopic viewing system.

According to an advantageous characteristic of the invention, by virtue of the reference frame transfer means 11, the position of the active member can be controlled dynamically on the basis of the prior modeling of the origin of intervention ORI and of the direction of intervention Δ.

The means 1 of dynamic display by three-dimensional imaging of the representations RSNH and RSR comprise a file 10 of two-dimensional image data. The file 10 consists for example of digitized data from tomographic sections, from radiographs, from maps of the patient's head, and contained in an appropriate mass memory.

The successive tomographic sections can be produced prior to the intervention in a conventional manner, after the reference structure SR has been put in place on the nonhomogeneous structure SNH.

According to an advantageous feature, the reference structure SR can consist of a plurality of marks or notable points which can be both sensed by the marker means 3 and detected on the two-dimensional images obtained.

Of course, the abovementioned two-dimensional tomographic sections can likewise be produced by any medical imaging means such as a nuclear magnetic resonance system.

In a characteristic and well-known manner, each two-dimensional image corresponding to a tomographic scanner section corresponds to a structural slice thickness of about 2 to 3 mm, the pixels or image elements in the plane of the tomographic section being obtained with a precision of the order of ±1 mm. It is therefore understood that the marks or points constituting the reference structure SR appear on the images with a positional uncertainty, and an important feature of the invention will consist in minimizing these uncertainties as will be described later.

The system also comprises first means 110 for calculating and reconstructing three-dimensional images from the data from the file 10.

It also comprises a high-resolution screen 12 permitting the displaying of one or more three-dimensional or two-dimensional images constituting so many representations RSR and RSNH of the reference structure and of the nonhomogeneous structure.

Advantageously, the calculating means 110, the high-resolution screen and the mass memory containing the file 10 form part of a computer of the workstation type with conventional design and denoted MO1.

Preferably, the first calculating means 110 can consist of a CAD type program installed in the workstation MO1.

By way of non-limiting example, this program can be derived from the software marketed under the tradename “AUTOCAD” by the “AUTODESK” company in the United States of America.

Such software makes it possible, from the various digitized two-dimensional images, to reconstruct three-dimensional images constituting the representations of the structures RSR and RSNH in arbitrary orientations.

Thus, as has furthermore been represented in FIG. 1, the calculating means 4 and 11 can consist of a second computer, denoted MO2 in FIG. 1.

The first and second computers MO1 and MO2 are interconnected by a conventional digital link (bus or network).

As a variant, the computers MO1 and MO2 can be replaced by a single workstation.

The marker means 3 consist of a three-dimensional probe equipped with a tactile tip 30.

This type of three-dimensional probe, known per se and not described in detail, consists of a plurality of hinged arms, marked in terms of position with respect to a base integral with the operating table TO. It makes it possible to ascertain the coordinates of the tactile tip 30 with respect to the origin O2 of the reference frame R2 with a precision better than 1 mm.

The probe is for example equipped with resolvers delivering signals representing the instantaneous position of the abovementioned tactile tip 30. The resolvers are themselves connected to the circuits for analog/digital conversion and sampling of the values representing these signals, these sampling circuits being interconnected in conventional manner to the second computer MO2 in order to supply it with the coordinates X2, Y2, Z2 of the tactile tip 30.

As a variant or additionally, and as represented diagrammatically, the marker means 3 can comprise a set of video cameras 31 and 32 (or else infrared cameras) enabling pictures to be taken of the structures SNH and SR.

The set of cameras can act as a stereoscopic system permitting the positional plotting of the base points of the reference structure SR, or of other points of the nonhomogeneous structure SNH, with respect to the second reference frame R2. The positional plotting can be done for example by appending a laser beam emission system making it possible to illuminate successively the points whose coordinates are sought, appropriate software making it possible to then determine the position of these points one by one with respect to R2. This software will not be described since it can consist of position and shape recognition software normally available on the market.
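By way of non-limiting illustration, the positional plotting by the stereoscopic pair can be reduced to triangulating the laser-illuminated point from the two camera sighting rays. The sketch below is an assumed minimal formulation (it presumes calibrated rays already expressed in R2; all names are illustrative), not the recognition software actually used:

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the common perpendicular between the two sighting rays
    o_i + s_i * d_i (illustrative; assumes calibrated camera rays in R2)."""
    o1, o2 = np.asarray(o1, float), np.asarray(o2, float)
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    w = o1 - o2
    b = d1 @ d2
    dd, e = d1 @ w, d2 @ w
    denom = 1.0 - b * b            # rays assumed non-parallel
    s1 = (b * e - dd) / denom
    s2 = (e - b * dd) / denom
    # the illuminated point is taken midway between the two closest ray points
    return 0.5 * ((o1 + s1 * d1) + (o2 + s2 * d2))
```

When the two rays intersect exactly at the illuminated point, the midpoint coincides with that point; otherwise it gives the least-squares compromise between the two lines of sight.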

According to another variant, the marker means 3 can comprise a telemetry system.

In this case, the marks of the structure SR can consist of small radiotransmitters implanted for example on the relevant points of the patient's head and designed to be visible on the two-dimensional images, appropriate electromagnetic or optical sensors (not shown) being provided in order to determine the coordinates of the said marks in the reference frame R2 or in a reference frame tied to the latter.

It is important to note here that the general function of the base points of the reference structure is, on the one hand, to be individually localizable on the reference structure, in order to deduce from this the coordinates in R2, and on the other hand, to be visualizable on the two-dimensional images so as to be identified (by their coordinates in R1) and included in the representation RSR on the screen.

It can therefore involve special marks affixed at arbitrary points of the lateral surface of the structure SNH, or else at notable points of the latter, or else, when the notable points can in themselves be localized with high precision both on the structure SNH and on the 2D sections, notable points totally devoid of marks.

In FIG. 2 is represented a plurality of marks, denoted M1 to Mi, these marks, in the case where the nonhomogeneous structure consists of the head of a patient, being localized for example between the eyebrows of the patient, on the latter's temples, and at the apex of the skull at a notable point such as the frontal median point.

More generally, for a substantially ovoid volume constituting the nonhomogeneous structure, there is advantageously provision for four base points at least on the outer surface of the volume.

Thus, as has been represented in FIG. 3, the four marks M1 to M4 of the reference structure are distributed so as preferably to define a more or less symmetric tetrahedron. The symmetry of the tetrahedron, represented in FIG. 3, is materialized by the vertical symmetry plane PV and the horizontal symmetry plane PH.

According to an advantageous characteristic, as will be seen later, the means of elaborating the reference frame transfer tools are designed to select three points of the tetrahedron which will define the “best plane” for the reference frame transfer.

Also, the presence of four or more points enables the additional point(s) to validate a specified selection.

More precisely, the presence of a minimum of four base points on the reference structure makes it possible to search for the minimum distortion between the points captured on the patient by the marker means consisting for example of the three-dimensional probe and the images of these points on the representation by three-dimensional imaging, the coordinates of which are calculated during processing. The best plane of the tetrahedron described earlier, that is to say the plane for which the deviation between the points actually captured by the three-dimensional probe and the corresponding points of the representation of the reference structure RSR is minimal, then becomes the reference plane for the reference frame transfer. Thus, the best correlation will be established between a modeled direction of intervention and a modeled origin of intervention, on the one hand, and the action of the member 50, on the other hand. Preferably, the origin of intervention will be placed at the center of the region in which the intervention is to be carried out, that is to say a tumor observed or treated for example.
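The search for the best plane can be illustrated by a least-squares rigid fit over every triple of base points, the retained triple being the one whose fit leaves the smallest deviation over all the points. The formulation below (a Kabsch-type fit; the function names and the max-deviation criterion are assumptions for illustration, not the system's actual software) sketches the idea:

```python
import numpy as np
from itertools import combinations

def rigid_transform(a, b):
    """Least-squares rotation R and translation t with R @ a_i + t ≈ b_i (Kabsch)."""
    ca, cb = a.mean(axis=0), b.mean(axis=0)
    H = (a - ca).T @ (b - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cb - R @ ca

def best_plane(p_r2, p_r1):
    """Among all triples of base points, retain the one whose rigid fit leaves
    the smallest deviation over ALL points: its plane is the 'best plane'."""
    best = None
    for tri in combinations(range(len(p_r2)), 3):
        R, t = rigid_transform(p_r2[list(tri)], p_r1[list(tri)])
        resid = np.linalg.norm((p_r2 @ R.T + t) - p_r1, axis=1).max()
        if best is None or resid < best[0]:
            best = (resid, tri, R, t)
    return best
```

With four base points, each triple is thus validated against the point left out, as described above.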

Furthermore, it will be possible to take the noted residual uncertainty into account in order to effect the representation of the model and of the tools on the dynamic display means.

A more detailed description of the means of intervention 5 will now be given in connection with FIG. 4.

Preferably, the means of intervention 5 comprise a carriage 52 which is translationally mobile along the operating table TO, for example on a rack, denoted 54, whilst being driven by a motor, not shown, itself controlled by the computer MO2 for example, via an appropriate link. This movement system will not be described in detail since it corresponds to a conventional movement system available on the market. As a variant, the carriage 52 can be mobile over a distinct path separated from the operating table TO, or immobile with respect to the operating table and then constitute a support.

The support carriage 52 comprises in the first place a sighting member OV, constituting the above-mentioned sighting system, which can consist of a binocular telescope.

The sighting member OV enables the surgeon, prior to the actual intervention, or during the latter, to sight the presumed position of the region in which the intervention is to be carried out.

Furthermore, and in a non-limiting manner, with the sighting member OV can be associated a helium-neon laser emission system, denoted EL, making it possible to secure the aiming of a fine positioning or sighting laser beam on the structure SNH and in particular, as will be seen in detail later, to indicate to the surgeon the position of an entry point PE prior to the intervention, to enable the latter to open the skull at the appropriate location, and likewise to indicate to him what the direction of intervention will be. Additionally, the illuminating of the relevant point of the nonhomogeneous structure or at the very least the lateral surface of the latter enables the video cameras 31 and 32 to carry out, if necessary, a positional plotting.

Preferably, a system for measuring position by telemetry 53 is provided to secure the precise measurement of the position of the support carriage 52 of the sighting member OV and of the laser emission system EL. During the operation, and in order to secure the intervention, the carriage 52 can be moved along the rack 54, the position of the carriage 52 being measured very precisely by means of the system 53. The telemetry system 53 is interconnected with the microcomputer MO2 by an appropriate link.

The means of intervention 5 can advantageously consist of a guide arm 51 for the active member 50.

The guide arm 51 can advantageously consist of several hinged segments, each hinge being equipped with motors and resolvers making it possible to secure control of movement of the end of the support arm and the positional plotting of this same end and therefore of the active member 50 according to six degrees of freedom with respect to the carriage 52. The six degrees of freedom comprise, of course, three translational degrees of freedom with respect to a reference frame tied to the carriage 52 and three rotational degrees of freedom along these same axes.
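The positional plot of the end of such a hinged arm can be illustrated by chaining one homogeneous transform per hinge, each built from the angle read off the corresponding resolver. The sketch below assumes a simple axis-angle-plus-offset geometry per segment (an assumption for illustration; the actual arm geometry is not specified here):

```python
import numpy as np

def joint_transform(axis, angle, offset):
    """Homogeneous transform of one hinge: rotation `angle` about unit `axis`
    (Rodrigues formula) combined with translation `offset` along the segment."""
    x, y, z = axis / np.linalg.norm(axis)
    c, s = np.cos(angle), np.sin(angle)
    C = 1.0 - c
    T = np.eye(4)
    T[:3, :3] = np.array([[c + x*x*C,   x*y*C - z*s, x*z*C + y*s],
                          [y*x*C + z*s, c + y*y*C,   y*z*C - x*s],
                          [z*x*C - y*s, z*y*C + x*s, c + z*z*C]])
    T[:3, 3] = offset
    return T

def end_position(carriage_pose, joints):
    """Chain the hinge transforms to plot the tip of the active member in R2.
    `joints` is a list of (axis, resolver_angle, segment_offset) tuples."""
    T = carriage_pose.copy()
    for axis, angle, offset in joints:
        T = T @ joint_transform(np.asarray(axis, float), angle,
                                np.asarray(offset, float))
    return T[:3, 3]
```

The carriage pose itself would come from the telemetry system 53; each resolver supplies one `angle`.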

Thus, the support arm 51 and the member 50 are marked in terms of instantaneous position with respect to the second reference frame R2, on the one hand by way of the positional plot of the mobile carriage 52 and, on the other hand, by way of the resolvers associated with each hinge of the support arm 51.

In the case of an intracranial neurosurgical intervention, the active member 50 can be removed and can consist of a trephining tool, a needle, a radioactive or chemical implant, a laser or radioisotope emission head or an endoscopic viewing system. These various members will not be described since they correspond to instruments normally used in neurosurgery.

The materializing of the modeled direction of intervention can be effected by means of the laser emitter EL. This sighting being performed, the guide arm 51 can then be brought manually or in steered manner into superposition with the direction of intervention Δ.

In the case of manual positioning, the resolvers associated with the sighting member OV and the laser emitter EL, if appropriate, make it possible to record the path of the sighting direction, constituting in particular the actual direction of intervention, on the representation of the nonhomogeneous structure in the dynamic display means 1.

Furthermore, as will be described later and in preferential manner, the intervening surgeon will be able firstly to define a simulated intervention path and steer thereto the movements of the active member 50 in the nonhomogeneous structure in order effectively to secure all or part of the intervention.

In this case, the progress of the intervention tool 50 is then steered directly to the simulated path (data ORI, Δ) by involving the reference frame transfer means 11 in order to express the path in the reference frame R2.

The implementation of the operational mode of the system of the invention will now be described in more detail in connection with FIGS. 5a and 5b.

According to FIG. 5a, the first step consists in obtaining and organizing in memory the two-dimensional image data (step 100). Firstly, the nonhomogeneous structure SNH is prepared. In the case of a neurosurgical intervention for example, this means that the patient's head can be equipped with marks constituting the base points of the reference structure SR. These marks can be produced by means of points consisting of a dye partially absorbing the X-rays, such as a radiopaque dye.

The abovementioned marks are implanted by the surgeon on the patient's head at notable points of the latter [sic], and images can then be taken of the nonhomogeneous structure SNH by tomography for example, by means of an apparatus of the X-ray scanner type.

This operation will not be described in detail since it corresponds to conventional operations in the field of medical imaging.

The two-dimensional image data obtained are then constituted as digitized data in the file 10, these data being themselves marked with respect to the reference frame R1 and making it possible, on demand, to restore the two-dimensional images onto the dynamic display means 1, these images representing superimposed sections of the nonhomogeneous structure SNH.

From the digitized image data available to the surgeon, the latter then proceeds, as indicated at 101 in FIG. 5a, to select the structures of interest of the abovementioned images.

The purpose of this step is to facilitate the work of the surgeon by forming three-dimensional images which contain only the contours of the elements of the structure which are essential for geometrical definition and real-time monitoring of the progress of the intervention.

In the case where the nonhomogeneous structure SNH consists of the head of a patient, an analysis of the two-dimensional image data makes it possible, from values of optical density of the corresponding image-points, straight-away to extract the contours of the skull, to check the distance scales, etc.

Preferably, the abovementioned operations are performed on a rectangle of interest for a given two-dimensional image, this making it possible, by moving the rectangle of interest, to cover the whole of the image.

The above analysis is performed by means of suitable software which thus makes it possible to extract and vectorize the contours of the structures which will be modeled in the representations RSNH and RSR.

The structures modeled in the case of a neurosurgical intervention are for example the skull, the cerebral ventricles, the tumor to be observed or treated, the falx cerebri, and the various functional regions.

According to a feature of the interactive system of the invention, the surgeon may have available a digitizing table or other graphics peripheral making it possible, for each displayed two-dimensional image, to rectify or complete the definition of the contour of a particular region of interest.

It will be noted finally that by superimposing the extracted contours on the displayed two-dimensional image, the surgeon will be able to validate the extractions carried out.

The extracted contours are next processed by sampling points to obtain their coordinates in the reference frame R1, it being possible to constitute these coordinates as an ASCII type file. This involves step 102 for generating the three-dimensional data base.

This step is followed by a step 103 of reconstructing the three-dimensional model. This step consists firstly, with the aid of the CAD type software, in carrying out, on the basis of the contours of the structures of interest constituted as vectorized two-dimensional images, an extrapolation between the various sectional planes.

The abovementioned extrapolation is carried out preferably by means of a “B-spline” type algorithm which seems best suited. This extrapolation transforms a discrete item of information, namely the successive sections obtained by means of the scanner analysis, into a continuous model permitting three-dimensional representation of the volume envelopes of the structures.
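The transformation of the discrete sections into a continuous model can be illustrated by a uniform cubic B-spline evaluated over a stack of control points (for example, corresponding contour points taken one per section). The sketch below is an assumed minimal formulation of a “B-spline” type evaluation, not the algorithm actually embodied in the CAD software:

```python
import numpy as np

def cubic_bspline(ctrl, n=50):
    """Evaluate a uniform cubic B-spline over the control polygon `ctrl`
    (k x 3 array, one control point per acquisition section), returning
    n points of the continuous envelope."""
    ctrl = np.asarray(ctrl, float)
    # uniform cubic B-spline basis matrix
    M = np.array([[-1,  3, -3, 1],
                  [ 3, -6,  3, 0],
                  [-3,  0,  3, 0],
                  [ 1,  4,  1, 0]]) / 6.0
    segs = len(ctrl) - 3
    out = []
    for u in np.linspace(0.0, segs, n, endpoint=False):
        i = min(int(u), segs - 1)          # active segment
        t = u - i                          # local parameter in [0, 1)
        T = np.array([t**3, t**2, t, 1.0])
        out.append(T @ M @ ctrl[i:i + 4])
    return np.array(out)
```

Applied to each sampled contour point across the stack of sections, such an evaluation yields the smooth volume envelopes referred to above.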

It should be noted that the reconstruction of the volumes constituting the structures of interest introduces an approximation related in particular to the spacing and non-zero thickness of the acquisition sections. An important characteristic of the invention, as explained in detail elsewhere, is on the one hand to minimize the resulting uncertainties in the patient-model correlation, and on the other hand to take into account the residual uncertainties.

The CAD type software used possesses standard functions which enable the model to be manipulated in space by displaying it from different viewpoints through just a criterion defined by the surgeon (step 104).

The software can also reconstruct sectional representation planes of the nonhomogeneous structure which differ from the planes of the images from the file 10, this making it possible in particular to develop knowledge enhancing the data for the representation by building up a neuro-anatomical map.

The surgeon can next (step 105) determine a model of intervention strategy taking into account the modeled structures of interest, by evaluating the distance and angle ratios on the two- and three-dimensional representations displayed.

This intervention strategy will consist, in actual fact, on the one hand in localizing the tumor and in associating therewith a “target point”, which will subsequently be able to substitute for the origin common to all the objects (real and images) treated by the system, and on the other hand in determining a simulated intervention path respecting to the maximum the integrity of the structures of interest. This step can be carried out “in the office”, involving only the workstation.

Once this operation is performed and prior to the intervention, the following phase consists in implementing the steps required to establish as exact as possible a correlation between the structure SNH (real world) and the representation RSNH (computer world). This involves steps 106 to 109 of FIG. 5b.

Firstly, as represented in FIG. 5b at step 107, marking of the base points of the reference structure SR with respect to the second reference frame is carried out with the aid of the marker means 3, by delivering to the system the coordinates X2, Y2, Z2 of the said base points.

The following step 106 consists in identifying on the representations RSNH and RSR displayed on the screen the images of the base points which have just been marked. More precisely, with the aid of appropriate graphics peripherals, these representations (images) of the base points are selected one by one, the workstation supplying on each occasion (in this instance to the computer MO2) the coordinates of these points represented in the reference frame R1.

Thus the computer MO2 has available a first set of three-dimensional coordinates representing the position of the base points in R2, and a second set of three-dimensional coordinates representing the position of the representations of the base points in R1.

According to an essential feature of the invention, these data will be used to elaborate at 108, 109, tools for reference frame transfer (from R1 to R2 and vice versa) by calling upon an intermediate reference frame determined from the base points and constituting an intermediate reference frame specific to the reconstructed model.

More precisely, the intermediate reference frame is constructed from three base points selected so that, in this reference frame, the coordinates of the other base points after transfer from R2 and the coordinates of the representations of these other base points after transfer from R1 are expressed with the greatest consistency and minimum distortion.
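As a non-limiting sketch of this construction, an orthonormal intermediate frame can be derived from three base points, and a point marked in R2 carried into R1 through that common frame. The conventions and names below are illustrative assumptions (the actual transfer tools also select the triple minimizing the residual distortion, as described above):

```python
import numpy as np

def frame_from_points(p0, p1, p2):
    """Orthonormal frame (origin, 3x3 basis) built from three base points:
    origin at p0, x-axis toward p1, z-axis normal to the plane (p0, p1, p2)."""
    x = p1 - p0
    x = x / np.linalg.norm(x)
    z = np.cross(x, p2 - p0)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)
    return p0, np.column_stack([x, y, z])

def transfer(p_in_r2, triple_r2, triple_r1):
    """Carry a point marked in R2 into R1 through the intermediate frame
    defined by the same three base points in each reference frame."""
    o2, B2 = frame_from_points(*triple_r2)
    o1, B1 = frame_from_points(*triple_r1)
    local = (p_in_r2 - o2) @ B2        # coordinates in the intermediate frame
    return o1 + B1 @ local
```

The inverse transfer (R1 to R2) is obtained by exchanging the roles of the two triples.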

When the step of elaborating the reference frame transfer tools is concluded, these tools can be used by the system to secure optimal coupling between the real world and the computer world (step 1110).

Furthermore, according to a subsidiary feature of the present invention, the system can create on the display means a representation of the nonhomogeneous structure and of the intervention member which takes account of the deviations and distortions remaining after the “best” reference frame transfer tools have been selected (residual uncertainties). More precisely, from these deviations can be deduced by the calculating means a standard error likely to appear in the mutual positioning between the representation of the nonhomogeneous structure and the representation of elements (tools, sighting axes, etc.) referenced on R2 when using the reference frame transfer tools. This residual uncertainty, which may in practice be given substance through an error matrix, can be used for example to represent certain contours (tool, structures of interest to be avoided during the intervention, etc.) with dimensions larger than those which would normally be represented starting from the three-dimensional data base or with the aid of coordinates marked in R2, the said larger dimensions being deduced from the “normal” dimensions by involving the error matrix. For example, if the member were represented normally, in transverse section, by a circle of diameter D1, a circle of diameter D2>D1 can be represented in substance, with the difference D2-D1 deduced from the standard error value. In this way, when a direction of intervention is selected making it possible to avoid traversing certain structures of interest, the taking into account of an “enlarged” size of the intervention tool will eradicate any risk of the member, because of the abovementioned errors, accidentally traversing these structures.
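As a purely illustrative numerical rule (the error-matrix formulation above is more general, and the margin factor below is an assumption), the enlarged diameter D2 can be derived from D1 by adding a margin proportional to the standard error of the residual transfer deviations:

```python
import numpy as np

def inflated_diameter(d1, residuals, k=2.0):
    """Hypothetical safety-margin rule: enlarge the displayed tool diameter
    D1 by k standard errors of the residual deviations, applied on each
    side of the transverse section."""
    sigma = float(np.sqrt(np.mean(np.square(residuals))))
    return d1 + 2.0 * k * sigma
```

A tool of 10 mm displayed diameter with 1 mm residual standard error and k = 2 would thus be drawn with a 14 mm circle, so that a path clearing the displayed contour also clears the real structures.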

Back at step 105, and as will be seen in more detail with reference to FIGS. 9a and 9b, the reference origin of intervention ORI and the direction of intervention Δ, that is to say the simulated intervention path, can be determined according to various procedures.

According to a first procedure, the trajectory can be defined from two points, namely an entry point PE (FIG. 3) and a target point, that is to say substantially the center of the structure of interest consisting of the tumor to be observed or treated. Initially, these two points are localized on the model represented on the screen.

According to a second methodology, the trajectory can be determined from the abovementioned target point and from a direction which takes account of the types of structures of interest and of their positions with a view to optimally respecting their integrity.

After the abovementioned step 108, the surgeon can at step 1110 perform the actual intervention.

The intervention can advantageously be performed by steering the tool or active member over the simulated intervention path, determined in step 1110.

As a variant, given that the support arm 51 for the active member, equipped with its resolvers, continuously delivers the coordinates in R2 of the said active member to the system, it is also possible to perform the operation manually or semi-manually, by monitoring on the screen the position and motions of a representation of the tool and by comparing them with the simulated, displayed intervention path.

It will furthermore be noted that the modeled direction of intervention can be materialized with the aid of the laser beam described earlier, the positioning of the latter (with respect to R2) being likewise carried out by virtue of the reference frame transfer tools.

Certain functional features of the system of the invention will now be described in further detail with reference to FIGS. 6, 7, 8, 9a and 9b.

The module for elaborating the reference frame transfer tools (steps 108, 109 of FIG. 5b) will firstly be described with reference to FIG. 6.

This module comprises a first sub-module 1001 for acquiring three points A, B, C, the images of the base points of SR on the representation RSNH (the coordinates of these points being expressed in the computer reference frame R1), by successive selections of these points on the representation. To this effect, the surgeon is led, by means of a graphics interface such as a “mouse”, to point successively at the three selected points A, B, C.

The module for preparing the transfer tools also comprises a second sub-module, denoted 1002, for creating a unit three-dimensional orthogonal matrix M, this matrix being characteristic of a right-handed orthonormal basis represented by three unit vectors {right arrow over (i)}, {right arrow over (j)}, {right arrow over (k)}, which define an intermediate reference frame tied to R1.

The unit vectors {right arrow over (i)}, {right arrow over (j)} and {right arrow over (k)} are given by the relations:

{right arrow over (j)}={right arrow over (AB)}/∥{right arrow over (AB)}∥
{right arrow over (k)}=({right arrow over (BA)}Λ{right arrow over (BC)})/∥{right arrow over (BA)}Λ{right arrow over (BC)}∥
{right arrow over (i)}={right arrow over (j)}Λ{right arrow over (k)}
where “∥ ∥” designates the norm of the relevant vector and “Λ” designates the vector product of the relevant vectors.
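These relations can be sketched numerically as follows (a minimal illustration using NumPy; the function name `basis_matrix` is ours). Applied to the image points A, B, C it yields the matrix M; applied to the sensed base points D, E, F it yields N.

```python
import numpy as np

def basis_matrix(A, B, C):
    """Right-handed orthonormal basis from three non-collinear points,
    following the relations j = AB/||AB||, k = (BA x BC)/||BA x BC||,
    i = j x k."""
    A, B, C = (np.asarray(p, dtype=float) for p in (A, B, C))
    j = (B - A) / np.linalg.norm(B - A)
    n = np.cross(A - B, C - B)                 # BA x BC
    k = n / np.linalg.norm(n)
    i = np.cross(j, k)
    return np.column_stack((i, j, k))          # unit orthogonal 3x3 matrix

M = basis_matrix([0, 0, 0], [0, 2, 0], [3, 0, 0])  # here M is the identity
```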

Similarly, the module for preparing the transfer tools comprises a third sub-module, denoted 1003, for acquiring three base points D, E, F, of the structure SR, these three points being those whose images on the model are the points A, B, C respectively. For this purpose, the surgeon, for example by means of the tactile tip 30, successively senses these three points to obtain their coordinates in R2.

The sub-module 1003 is itself followed, as represented in FIG. 6, by a fourth sub-module 1004 for creating a unit three-dimensional orthogonal matrix N, characteristic of a right-handed orthonormal basis comprising three unit vectors {right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′ and which is tied to the second reference frame R2 owing to the fact that the nonhomogeneous structure SNH is positionally tied with respect to this reference frame.

The three unit vectors {right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′ are defined by the relations:

{right arrow over (j)}′={right arrow over (DE)}/∥{right arrow over (DE)}∥
{right arrow over (k)}′=({right arrow over (ED)}Λ{right arrow over (EF)})/∥{right arrow over (ED)}Λ{right arrow over (EF)}∥
{right arrow over (i)}′={right arrow over (j)}′Λ{right arrow over (k)}′

As indicated above, whereas the base points of the reference structure can be marked in R2 with high precision, their representations in the computer base R1 are marked with a certain margin of error, given on the one hand the non-zero thickness (typically 2 to 3 mm) of the slices represented by the two-dimensional images from the file 10, and on the other hand (in general to a lesser extent) the definition of each image element or pixel of a section.

According to the invention, once a pair of transfer matrices M, N has been elaborated with selected points A, B, C, D, E, F, it is sought to validate this selection by using one or more additional base points; more precisely, for the or each additional base point, this point is marked in R2 with the aid of the probe 30, the representation of this point is marked in R1 after selection on the screen, and then the matrices N and M are applied respectively to the coordinates obtained, in order to obtain their expressions in the bases ({right arrow over (i)}′, {right arrow over (j)}′, {right arrow over (k)}′) and ({right arrow over (i)}, {right arrow over (j)}, {right arrow over (k)}) respectively. If these expressions are in good agreement, these two bases can be regarded as a single intermediate reference frame, thus securing as exact a mathematical coupling as possible between the computer reference frame R1 tied to the model and the “real” reference frame R2 tied to the patient.

In practice, the module for elaborating the reference frame transfer tools can be designed to perform steps 1001 to 1004 in succession on triples of base points which differ on each occasion (for example, if four base points have been defined, associated with four representations in RSR, there are four possible triples), in order to perform the validation step 1005 for each of these selections and finally to choose the triple for which the best validation is obtained, that is to say for which the deviation between the abovementioned expressions is smallest. This triple defines the “best plane” mentioned elsewhere in the description, and results in the “best” transfer matrices M and N.
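The search over triples can be sketched schematically as follows (our own minimal NumPy illustration; it uses the first point of each triple as a provisional common origin, which is an assumption, since the actual transfer of origin to ORI is described separately):

```python
import itertools
import numpy as np

def basis(A, B, C):
    # Right-handed orthonormal basis per the patent's relations:
    # j = AB/||AB||, k = (BA x BC)/||BA x BC||, i = j x k.
    j = (B - A) / np.linalg.norm(B - A)
    n = np.cross(A - B, C - B)
    k = n / np.linalg.norm(n)
    return np.column_stack((np.cross(j, k), j, k))

def best_triple(img_pts, real_pts):
    """Steps 1001-1005 over every possible triple: build M from the
    image points and N from the corresponding sensed base points,
    express the left-out points in both intermediate bases, and keep
    the triple with the smallest total deviation."""
    idx = range(len(img_pts))
    best = None
    for t in itertools.combinations(idx, 3):
        M = basis(*(img_pts[i] for i in t))
        N = basis(*(real_pts[i] for i in t))
        a, d = img_pts[t[0]], real_pts[t[0]]
        dev = sum(np.linalg.norm(M.T @ (img_pts[i] - a) - N.T @ (real_pts[i] - d))
                  for i in idx if i not in t)
        if best is None or dev < best[0]:
            best = (dev, M, N)
    return best
```

When the real points are a rigid displacement of the image points, the deviation of the winning triple is essentially zero, which is exactly the “good agreement” criterion of the validation step.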

As a variant, it will be possible for the selection of the best plane to be made at least in part by the surgeon by virtue of his experience.

It should be noted that the reference frame transfer will only be concluded by supplementing the matrix calculation with the matrices M, N with a transfer of origin, so as to create a new common origin for example at the center of the tumor to be observed or treated (point ORI). This transfer of origin is effected simply by appropriate subtraction of vectors on the one hand on the coordinates in R1, and on the other hand on the coordinates in R2. These vectors to be subtracted are determined after localization of the center of the tumor on the representation.

Furthermore, the means described above for establishing the coupling between the patient's world and the model's world can also be used to couple to the model's world that of map data, also stored in the workstation and expressed in a different reference frame denoted R3. In this case, since these data contain no specific visible mark, the earlier described elaboration of matrices is performed by substituting for these marks the positions of notable points of the patient's head. These may be temporal points, the frontal median point, the apex of the skull, the center of gravity of the orbits of the eyes, etc.

The corresponding points of the model can be obtained either by selection by mouse or graphics tablet on the model, or by sensing on the patient himself and then using the transfer matrices.

The above step of elaborating the reference frame transfer tools, conducted in practice by the calculating means 4, makes it possible subsequently to implement the reference frame transfer means (FIGS. 7 and 8).

With reference to FIG. 7, the first transfer sub-module 201 comprises a procedure denoted 2010 for acquiring the coordinates XM, YM, ZM, expressed in R1, of the point to be transferred, by selecting on the representation.

The procedure 2010 is followed by a procedure 2011 for calculating the coordinates XP, YP, ZP (expressed in R2) of the corresponding real point on the patient through the transformation:

{XP, YP, ZP}=M*N⁻¹*{XM, YM, ZM}
The procedure 2011 is followed by a processing procedure 2012 utilizing the calculated coordinates XP, YP, ZP, for example to indicate the corresponding point on the surface of the structure SNH by means of the laser emission system EL, or again to secure the intervention at the relevant point with coordinates XP, YP, ZP (by steering the active member).

Conversely, in order to secure a transfer from SNH to RSNH, the second sub-module 202 comprises (FIG. 8) a procedure denoted 2020 for acquiring on the structure SNH the coordinates XP, YP, ZP (expressed in R2) of a point to be transferred.

These coordinates can be obtained by means of the tactile tip 30 for example. The procedure 2020 is followed by a procedure 2021 for calculating the corresponding coordinates XM, YM, ZM in R1 through the transformation:

{XM, YM, ZM}=N*M⁻¹*{XP, YP, ZP}

A procedure 2022 next makes it possible to effect the displaying of the point with coordinates XM, YM, ZM on the model or again of a straight line or of a plane passing through this point and furthermore meeting other criteria.
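These two complementary transfers, together with the transfer of origin to the common point ORI described earlier, can be sketched as follows (a minimal NumPy illustration; the function names and the explicit origin arguments are ours):

```python
import numpy as np

def patient_to_model(p, M, N, ori_p, ori_m):
    """Procedure 2021: R2 -> R1 through {XM,YM,ZM} = N * M^-1 * {XP,YP,ZP}.
    Both frames are first shifted to the common origin ORI
    (ori_p: ORI expressed in R2, ori_m: ORI expressed in R1).
    M and N are unit orthogonal, so M^-1 is the transpose of M."""
    return N @ M.T @ (p - ori_p) + ori_m

def model_to_patient(m, M, N, ori_p, ori_m):
    """Procedure 2011: the inverse transfer, R1 -> R2."""
    return M @ N.T @ (m - ori_m) + ori_p

# Round trip: a point sensed in R2 maps to the model and back unchanged.
M = np.eye(3)
N = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
ori_p, ori_m = np.array([1.0, 2.0, 3.0]), np.zeros(3)
p = np.array([4.0, 5.0, 6.0])
m = patient_to_model(p, M, N, ori_p, ori_m)
```

The round trip is what the checking procedure of sub-modules 201, 202 relies on: a base point with coordinates known in both R2 and R1 must relocate correctly after transfer.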

It will be noted here that the two sub-modules 201, 202 can be used by the surgeon at any moment for the purpose of checking the valid nature of the transfer tools; in particular, it is possible to check at any time that a real base point, with coordinates known both in R2 and R1 (for example a base point of SR or an arbitrary notable point of the structure SNH visible on the images), correctly relocates with respect to its image after transferring the coordinates in step 2011.

In the event of an excessive difference, a new step of elaboration of the transfer tools is performed.

Furthermore, the sub-modules 201, 202 can be designed also to take into account the residual uncertainty discussed above, so as for example to represent a sensed point on the screen not as a single point, but in the form of a circle or a sphere representing the said uncertainty.

From a simulated intervention path, for example on the representation RSNH, or from any other straight line selected by the surgeon, the invention furthermore enables the model to be represented on the screen from a viewpoint corresponding to this straight line. Thus the third transfer subroutine comprises, as represented in FIGS. 9a and 9b, a first module 301 for visualizing the representation in a direction given by two points and a second module 302 for visualizing the representation in a direction given by an angle of elevation and an angle of azimuth.

The first module 301 for visualizing the representation in a direction given by two points comprises a first sub-module denoted 3010 permitting acquisition of the two relevant points which will define the selected direction. The coordinates of these points are expressed in the reference frame R1, these points having either been acquired previously on the nonhomogeneous structure SNH for example by means of the tactile tip 30 and then subjected to the reference frame transfer, or chosen directly on the representation by means of the graphics interface of the “mouse” type.

The first sub-module 3010 is followed by a second sub-module denoted 3011 permitting the creation of a unit, orthogonal three-dimensional matrix V characteristic of a right-handed orthonormal basis {right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″, the unit vectors {right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″ being determined through the relations:

{right arrow over (k)}″={right arrow over (AB)}/∥{right arrow over (AB)}∥; {right arrow over (i)}″·{right arrow over (k)}″=0; {right arrow over (i)}″·{right arrow over (z)}=0; ∥{right arrow over (i)}″∥=1; {right arrow over (j)}″={right arrow over (k)}″Λ{right arrow over (i)}″
where “Λ” represents the vector product and “·” symbolizes the scalar product.

The sub-module 3011 is followed by a routine 3012 which, for all the points of the entities (structures of interest) of the three-dimensional data base with coordinates XW, YW, ZW in R1, calls a first subroutine 3013 securing the conversion into the orthonormal basis ({right arrow over (i)}″, {right arrow over (j)}″, {right arrow over (k)}″) by the relation:

{XV, YV, ZV}=V*{XW, YW, ZW}

The subroutine 3013 is then followed by a subroutine 3014 for displaying the plane i″, j″, the subroutines 3013 and 3014 being called up for all the points, as symbolized by the arrow returning to block 3012 in FIG. 9a.

When all the points have been processed, an output module 3015 permits return to a general menu, which will be described later in the description. It is understood that this module enables two-dimensional images to be reconstructed in planes perpendicular to the direction defined by A and B.
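The construction of V and the reprojection of the data-base points into the (i″, j″) display plane can be sketched as follows (a minimal NumPy illustration; the sign chosen for i″ and the function names are our assumptions, since the relations fix i″ only up to orientation):

```python
import numpy as np

def view_basis(A, B):
    """Sub-module 3011: k'' along AB, i'' horizontal (unit length,
    perpendicular to k'' and to the vertical axis z), j'' = k'' x i''.
    Degenerate when AB is parallel to z."""
    z = np.array([0.0, 0.0, 1.0])
    k = (B - A) / np.linalg.norm(B - A)
    i = np.cross(z, k)
    i = i / np.linalg.norm(i)
    j = np.cross(k, i)
    # Rows are the basis vectors, so {XV, YV, ZV} = V * {XW, YW, ZW}.
    return np.vstack((i, j, k))

def project(points, V):
    """Routines 3012-3014: express each point in the viewing basis and
    keep the (i'', j'') coordinates for display."""
    return [(V @ p)[:2] for p in points]
```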

In the same way, the second module 302 (FIG. 9b) for visualizing the representation from a viewpoint given by an angle of elevation and an angle of azimuth comprises a first sub-module 3020 for acquiring the two angles in the representation frame of reference.

The selection of the angles of elevation and of azimuth can be made by selecting from a predefined data base, or by moving software cursors associated with each view, or else by modification relative to a current direction, such as the modeled direction of intervention. The sub-module 3020 is itself followed by a second sub-module 3021 for creating a unit orthogonal three-dimensional matrix W characteristic of a right-handed orthonormal basis of unit vectors {right arrow over (i)}′″, {right arrow over (j)}′″, {right arrow over (k)}′″. They are defined by the relations:
{right arrow over (i)}′″·{right arrow over (k)}′″=0;
{right arrow over (k)}′″·{right arrow over (z)}=sin(azimuth);
{right arrow over (j)}′″·{right arrow over (z)}=0;
{right arrow over (i)}′″·{right arrow over (y)}=cos(elevation);
{right arrow over (i)}′″·{right arrow over (x)}=sin(elevation);
{right arrow over (j)}′″={right arrow over (k)}′″Λ{right arrow over (i)}′″

A routine 3022 is then called for all the points of the entities of the three-dimensional data base with coordinates XW, YW, ZW and enables a first sub-routine 3023 to be called, permitting calculation of the coordinates of the relevant point in the right-handed orthonormal basis {right arrow over (i)}′″, {right arrow over (j)}′″, {right arrow over (k)}′″ through the transformation:

{XV, YV, ZV}=W*{XW, YW, ZW}

The sub-routine 3023 is itself followed by a sub-routine 3024 for displaying the plane i′″, j′″, the two sub-routines 3023 and 3024 then being called up for each point as symbolized by the return via the arrow to the block 3022 for calling the abovementioned routine. When all the points have been processed, an output sub-module 3025 permits a return to the general menu.
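The relations printed above appear partly garbled in reproduction; the sketch below follows one self-consistent reading (ours, not necessarily the patent's exact convention), with elevation measured from the horizontal plane and azimuth about the vertical axis:

```python
import math
import numpy as np

def view_basis_angles(elevation, azimuth):
    """Sub-module 3021 (our reading): derive the viewing direction k'''
    from the two angles, then complete a right-handed orthonormal
    basis exactly as in the two-point case.
    Degenerate at elevation = +/- pi/2 (view along the vertical axis)."""
    k = np.array([math.cos(elevation) * math.cos(azimuth),
                  math.cos(elevation) * math.sin(azimuth),
                  math.sin(elevation)])
    i = np.cross([0.0, 0.0, 1.0], k)
    i = i / np.linalg.norm(i)
    j = np.cross(k, i)
    # Rows are the basis vectors: {XV, YV, ZV} = W * {XW, YW, ZW}.
    return np.vstack((i, j, k))
```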

Of course, all of the programs, sub-routines, modules, sub-modules and routines described earlier are managed by a general “menu” type program so as to permit interactive driving of the system by screen dialogue with the intervening surgeon through specific screen pages.

A more specific description of a general flow diagram illustrating this general program will now be given in connection with FIGS. 10a and 10b.

Thus, in FIG. 10a has been represented in succession a screen page 4000 relating to the loading of data from the digitized file 10, followed by a screen page 4001 making it possible to secure the parameterizing of the grey scales of the display on the dynamic display means 1 and to calibrate the image, for example.

The screen page 4001 is followed by a screen page 4002 making it possible to effect the generation of a global view and then a step or screen page 4003 makes it possible to effect an automatic distribution of the sections on the screen of the workstation.

A screen page 4004 makes it possible to effect a manual selection of sections and then a screen page 4005 makes it possible to effect the selection of the strategy (search for the entry points and for the possible directions of intervention, first localizing of the target (tumor . . . ) to be treated . . . ), as defined earlier, and to select the position and horizontal, sagittal and frontal distribution of the sections.

A screen page 4006 also makes it possible to effect a display of the settings of a possible stereotaxic frame.

It will be recalled that the reference structure SR advantageously replaces the stereotaxic frame formerly used to effect the marking of position inside the patient's skull.

There may furthermore be provided a screen page 4007 for choosing strategic sections by three-dimensional viewing, on selection by the surgeon, and then at 4008 the aligning of the references of the peripherals (tool, sighting members, etc.) with the aid of the probe 30.

A screen page 4009 is also provided to effect the search for the base points on the patient with the aid of the said probe, following which the steps of construction of the reference frame transfer tools and of actual reference frame transfer are performed, preferably in a user-transparent manner.

Another screen page 4010 is then provided, so as to effect the localizing of the target on the representation (for example a tumor to be observed or treated in the case of a neurosurgical intervention) in order subsequently to determine a simulated intervention path.

Then a new screen page 4011 makes it possible to effect the setting of the guides for the tool on the basis of this simulated path before opening up the skin and bone flaps on the patient's skull.

Then a new localizing step 4012 makes it possible to check whether the position of the guides corresponds correctly to the simulated intervention path.

The screen page 4012 is followed by a so-called intervention screen page, the intervention being performed in accordance with step 1110 of FIG. 5b.

A more detailed description of the interactive dialogue between the surgeon and the system during a surgical, and in particular a neurosurgical, intervention will follow with reference to FIG. 10c and to all of the preceding description.

The steps of FIG. 10c are also integrated in the general program mentioned earlier; there are undertaken in succession a first phase I (preparation of the intervention), then a second phase II (prior to the actual intervention, the patient is placed in a condition for intervention, the reference structure SR being tied to the second reference frame R2), then a third phase III (intervention) and finally a post-intervention phase IV.

With a view to preparing the intervention, the system requests the surgeon (step 5000) to choose the elementary structures of interest (for example bones of the skull, ventricles, vascular regions, the tumor to be explored or treated, and the images of the marks constituting in the first reference frame the representation RSR).

The choice of the elementary structures of interest is made on the display of the tomographic images, for example, called up from the digitized file 10.

The system next performs, at step 5001, a modeling of the structures of interest, as described earlier. Then, the nonhomogeneous structure having been thus constituted as a three-dimensional model RSNH displayed on the screen, the intervening surgeon is then led to perform a simulation by three-dimensional imaging, at step 5002, with a view to defining the intervention path of the tool 50.

During phase II the patient being placed in a condition for intervention and his head and the reference structure SR being tied to the second reference frame R2, the surgeon performs at step 5003 a search for the position of the marks M1 to M4 constituting base points of the reference structure in the second reference frame R2, and then during a step 5004, performs a search for the position of the sighting systems, visualizing member OV, or of the tools and intervention instruments 50, still in the second reference frame R2, so as, if appropriate, to align these implements with respect to R2.

The system then performs the validation of the intervention/patient spaces and of the representation by three-dimensional imaging, in order next to determine the common origin of intervention ORI. In other words, the matrix reference frame transfer described above is supplemented with the necessary origin translations (origins O1 and O2 aligned on ORI).

This operation is performed as described earlier.

Phase III corresponds to the intervention, during which the system effects at step 5006 a permanent coupling in real time between the direction of aim of the active member 50 and/or of the sighting member OV (and if appropriate of the laser beam) and the direction of aim (of observation) simulated by three-dimensional imaging on the display means 1, and vice versa.

In the following step 5007, the coupling is effected of the movements and motions of the intervention instrument with their movements simulated by three-dimensional imaging, with automatic or manual conduct of the intervention.

As noted at 5008, the surgeon can be supplied with a permanent display of the original two-dimensional sectional images in planes specified with respect to the origin ORI and to the direction of intervention. Such a display enables the surgeon at any time to follow the progress of the intervention in real time and to be assured that the intervention is proceeding in accordance with the simulated intervention.

In phase IV, which is executed after the intervention, the system effects a saving of the data acquired during the intervention, this saving making it possible subsequently to effect a comparison, in real time or deferred, in the event of successive interventions on the same patient.

Furthermore, the saved data make it possible to effect a playback of the operations carried out with the option of detailing and supplementing the regions traversed by the active member 50.

Thus, a particularly powerful interactive system for local intervention has been described.

Thus, the system which is the subject of the present invention makes it possible to represent a model containing only the essential structures of the nonhomogeneous structure, this facilitating the work of preparation and of monitoring of the intervention by the surgeon.

Moreover, the system, by virtue of the algorithms used and in particular by minimizing the distortion between the real base points and their images in the 2D sections or the maps, makes it possible to establish a two-way coupling between the real world and the computer world through which the transfer errors are minimized, making possible concrete exploitation of the imaging data in order to steer the intervention tool.

To summarize, the system makes possible an interactive medical usage not only to create a three-dimensional model of the nonhomogeneous structure but also to permit a marking in real time with respect to the internal structures and to guide the surgeon in the intervention phase.

More generally, the invention makes it possible to end up with a coherent system in respect of:

Accordingly, the options offered by the system are, in a non-limiting manner, the following:

Uhl, Jean Francois, Henrion, Joel, Scriban, Michel, Thiebaut, Jean-Baptiste

Patent Priority Assignee Title
10499997, Jan 03 2017 MAKO Surgical Corp. Systems and methods for surgical navigation
10856942, Mar 02 2018 Cilag GmbH International System and method for closed-loop surgical tool homing
11022421, Jan 20 2016 LUCENT MEDICAL SYSTEMS, INC Low-frequency electromagnetic tracking
11707330, Jan 03 2017 MAKO Surgical Corp. Systems and methods for surgical navigation
9950194, Sep 09 2014 Mevion Medical Systems, Inc.; MEVION MEDICAL SYSTEMS, INC Patient positioning system
Patent Priority Assignee Title
1576781,
1735726,
2407845,
2650588,
2697433,
3016899,
3017887,
3061936,
3073310,
3109588,
3294083,
3367326,
3439256,
3577160,
3614950,
3644825,
3674014,
3702935,
3704707,
3821469,
3868565,
3941127, Oct 03 1974 Apparatus and method for stereotaxic lateral extradural disc puncture
3983474, Feb 21 1975 KAISER AEROSPACE & ELECTRONICS CORPORATION, A CORP OF NV Tracking and determining orientation of object using coordinate transformation means, system and process
4017858, Jul 30 1973 KAISER AEROSPACE & ELECTRONICS CORPORATION, A CORP OF NV Apparatus for generating a nutating electromagnetic field
4037592, May 04 1976 Guide pin locating tool and method
4052620, Nov 28 1975 Picker Corporation Method and apparatus for improved radiation detection in radiation scanning systems
4054881, Apr 26 1976 KAISER AEROSPACE & ELECTRONICS CORPORATION, A CORP OF NV Remote object position locater
4117337, Nov 03 1977 General Electric Company Patient positioning indication arrangement for a computed tomography system
4173228, May 16 1977 Applied Medical Devices Catheter locating device
4182312, May 20 1977 Dental probe
4197855, Apr 04 1977 Siemens Aktiengesellschaft Device for measuring the location, the attitude and/or the change in location or, respectively, attitude of a rigid body in space
4202349, Apr 24 1978 Radiopaque vessel markers
4228799, Sep 28 1977 Method of guiding a stereotaxic instrument at an intracerebral space target point
4256112, Feb 12 1979 David Kopf Instruments Head positioner
4262306, Apr 27 1977 Method and apparatus for monitoring of positions of patients and/or radiation units
4287809, Aug 20 1979 Honeywell Inc. Helmet-mounted sighting system
4298874, Oct 18 1976 KAISER AEROSPACE & ELECTRONICS CORPORATION, A CORP OF NV Method and apparatus for tracking objects
4314251, Jul 30 1979 KAISER AEROSPACE & ELECTRONICS CORPORATION, A CORP OF NV Remote object position and orientation locater
4317078, Oct 15 1979 Ohio State University Research Foundation, The Remote position and orientation detection employing magnetic flux linkage
4319136, Nov 09 1979 Computerized tomography radiograph data transfer cap
4328548, Apr 04 1980 CHITTENDEN BANK Locator for source of electromagnetic radiation having unknown structure or orientation
4328813, Oct 20 1980 Medtronic, Inc. Brain lead anchoring system
4339953, Aug 29 1980 Aisin Seiki Company, Ltd.; AISIN SEIKI COMPANY, LIMITED, Position sensor
4341220, Apr 13 1979 Sherwood Services AG Stereotactic surgery apparatus and method
4346384, Jun 30 1980 CHITTENDEN BANK Remote object position and orientation locator
4358856, Oct 31 1980 General Electric Company Multiaxial x-ray apparatus
4368536, Dec 17 1979 Siemens Aktiengesellschaft Diagnostic radiology apparatus for producing layer images
4396885, Jun 06 1979 Thomson-CSF Device applicable to direction finding for measuring the relative orientation of two bodies
4396945, Aug 19 1981 Solid Photography Inc. Method of sensing the position and orientation of elements in space
4403321, Jun 14 1980 U S PHILIPS CORPORATION Switching network
4418422, Feb 22 1978 Howmedica International, Inc. Aiming device for setting nails in bones
4419012, Sep 11 1979 GEC-Marconi Limited Position measuring system
4422041, Jul 30 1981 UNITED STATES of AMERICA, AS REPRESENTED BY THE SECRETARY OF THE ARMY Magnet position sensing system
4431005, May 07 1981 MBO LABORATORIES, INC , A CORP OF MA Method of and apparatus for determining very accurately the position of a device inside biological tissue
4485815, Aug 30 1982 Device and method for fluoroscope-monitored percutaneous puncture treatment
4506676, Sep 10 1982 Radiographic localization technique
4543959, Jun 04 1981 Instrumentarium Oy Diagnosis apparatus and the determination of tissue structure and quality
4548208, Jun 27 1984 Medtronic, Inc. Automatic adjusting induction coil treatment device
4571834, Feb 17 1984 XENON RESEARCH, INC Knee laxity evaluator and motion module/digitizer arrangement
4572198, Jun 18 1984 Varian, Inc Catheter for use with NMR imaging systems
4583538, May 04 1984 Sherwood Services AG Method and apparatus for stereotaxic placement of probes in the body utilizing CT scanner localization
4584577, Oct 20 1982 BROOKES & GA5TEHOUSE LIMITED Angular position sensor
4608977, Aug 29 1979 Sherwood Services AG System using computed tomography as for selective body treatment
4613866, May 13 1983 CHITTENDEN BANK Three dimensional digitizer with electromagnetic coupling
4617925, Sep 28 1984 SANDSTROM, GUNBRITT VIVIAN MONICA Adapter for definition of the position of brain structures
4618978, Oct 21 1983 Sherwood Services AG Means for localizing target coordinates in a body relative to a guidance system reference frame in any arbitrary plane as viewed by a tomographic image through the body
4621628, Sep 09 1983 Ortopedia GmbH Apparatus for locating transverse holes of intramedullary implantates
4625718, Jun 08 1984 HOWMEDICA INTERNATIONAL S DE R L Aiming apparatus
4638798, Sep 10 1980 Stereotactic method and apparatus for locating and treating or removing lesions
4642786, May 25 1984 POSITION ORIENTATION SYSTEM, LTD Method and apparatus for position and orientation measurement using a magnetic field and retransmission
4645343, Nov 11 1981 U S PHILIPS CORPORATION A CORP OF DE Atomic resonance line source lamps and spectrophotometers for use with such lamps
4649504, May 22 1984 CAE Electronics, Ltd. Optical position and orientation measurement techniques
4651732, Mar 17 1983 IMAGING ACCESSORIES, INC Three-dimensional light guidance system for invasive procedures
4653509, Jul 03 1985 The United States of America as represented by the Secretary of the Air Guided trephine samples for skeletal bone studies
4659971, Aug 16 1984 SEIKO INSTRUMENTS & ELECTRONICS LTD Robot controlling system
4660970, Nov 25 1983 Carl-Zeiss-Stiftung Method and apparatus for the contact-less measuring of objects
4673352, Jan 10 1985 Device for measuring relative jaw positions and movements
4688037, Aug 18 1980 CHITTENDEN BANK Electromagnetic communications and switching system
4701049, Jun 22 1983 B V OPTISCHE INDUSTRIE DE OUDE DELFT Measuring system employing a measuring method based on the triangulation principle for the non-contact measurement of a distance from the surface of a contoured object to a reference level. _
4705395, Oct 03 1984 LMI TECHNOLOGIES INC Triangulation data integrity
4705401, Aug 12 1985 CYBERWARE LABORATORY INC , A CORP OF CALIFORNIA Rapid three-dimensional surface digitizer
4706665, Dec 17 1984 Frame for stereotactic surgery
4709156, Nov 27 1985 EX-CELL-O CORPORATION, A CORP OF MICHIGAN Method and apparatus for inspecting a surface
4710708, May 04 1979 Baker Hughes Incorporated Method and apparatus employing received independent magnetic field components of a transmitted alternating magnetic field for determining location
4719419, Jul 15 1985 HARRIS GRAPHICS CORPORATION, MELBOURNE, FL , A CORP OF DE Apparatus for detecting a rotary position of a shaft
4722056, Feb 18 1986 Trustees of Dartmouth College Reference display systems for superimposing a tomagraphic image onto the focal plane of an operating microscope
4722336, Jan 25 1985 Placement guide
4723544, Jul 09 1986 Hemispherical vectoring needle guide for discolysis
4727565, Nov 14 1983 TURON AB A CORPORATION OF SWEDEN Method of localization
4733969, Sep 08 1986 CyberOptics Corporation Laser probe for determining distance
4737032, Aug 26 1985 CYBERWARE LABORATORY, INC , 2062 SUNSET DRIVE, PACIFIC GROVE, CA , 93950, A CORP OF CA Surface mensuration sensor
4737794, Dec 09 1985 CHITTENDEN BANK Method and apparatus for determining remote object orientation and position
4737921, Jun 03 1985 PICKER INTERNATIONAL, INC Three dimensional medical image display system
4742356, Dec 09 1985 CHITTENDEN BANK Method and apparatus for determining remote object orientation and position
4742815, Jan 02 1986 Computer monitoring of endoscope
4743770, Sep 22 1986 Mitutoyo Mfg. Co., Ltd. Profile-measuring light probe using a change in reflection factor in the proximity of a critical angle of light
4743771, Jun 17 1985 GENERAL SCANNING, INC Z-axis height measurement system
4745290, Mar 19 1987 FRANKEL, DAVID Method and apparatus for use in making custom shoes
4750487, Nov 24 1986 Stereotactic frame
4753528, Dec 13 1983 Quantime, Inc. Laser archery distance device
4761072, Oct 03 1984 LMI TECHNOLOGIES INC Electro-optical sensors for manual control
4764016, Jun 14 1985 BENGTSSON, ANDERS Instrument for measuring the topography of a surface
4771787, Dec 12 1985 RICHARD WOLF GMBH, KNITTLINGEN, GERMANY Ultrasonic scanner and shock wave generator
4779212, Sep 27 1985 Distance measuring device
4782239, Apr 05 1985 U S BANK NATIONAL ASSOCIATION, AS COLLATERAL AGENT Optical position measuring apparatus
4788481, Mar 10 1986 Mitsubishi Denki Kabushiki Kaisha Numerical control apparatus
4791934, Aug 07 1986 Brainlab AG Computer tomography assisted stereotactic surgery system and method
4793355, Apr 17 1987 Biomagnetic Technologies, Inc. Apparatus and process for making biomagnetic measurements
4794262, Dec 03 1985 SATO, YUKIO Method and apparatus for measuring profile of three-dimensional object
4797907, Aug 07 1987 OEC MEDICAL SYSTEMS, INC Battery enhanced power generation for mobile X-ray machine
4803976, Oct 03 1985 Synthes Sighting instrument
4804261, Mar 27 1987 Anti-claustrophobic glasses
4805615, Jul 02 1985 SCHAERER MAYFIELD USA, INC Method and apparatus for performing stereotactic surgery
4809694, May 19 1987 Biopsy guide
4821200, Apr 14 1986 JONKOPINGS LANS LANDSTING, A CORP OF SWEDEN Method and apparatus for manufacturing a modified, three-dimensional reproduction of a soft, deformable object
4821206, Nov 27 1984 Photo Acoustic Technology, Inc. Ultrasonic apparatus for positioning a robot hand
4821731, Apr 25 1986 SURGICAL NAVIGATION TECHNOLOGIES, INC Acoustic image system and method
4822163, Jun 26 1986 Rudolph Technologies, Inc Tracking vision sensor
4825091, Feb 05 1987 Carl-Zeiss-Stiftung Optoelectronic distance sensor with visible pilot beam
4829373, Aug 03 1987 VEXCEL IMAGING CORPORATION Stereo mensuration apparatus
4836778, May 26 1987 Vexcel Corporation Mandibular motion monitoring system
4838265, May 24 1985 Sherwood Services AG Localization device for probe placement under CT scanner imaging
4841967, Jan 30 1984 Positioning device for percutaneous needle insertion
4845771, Jun 29 1987 Picker International, Inc.; PICKER INTERNATIONAL, INC Exposure monitoring in radiation imaging
4849692, Oct 09 1986 BAE SYSTEMS PLC Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
4860331, Sep 12 1988 Izi Corporation Image marker device
4862893, Dec 08 1987 SURGICAL NAVIGATION TECHNOLOGIES, INC Ultrasonic transducer
4869247, Mar 11 1988 UNIVERSITY OF VIRGINIA ALUMNI, THE Video tumor fighting system
4875165, Nov 27 1987 University of Chicago Method for determination of 3-D structure in biplane angiography
4875478, Apr 10 1987 Portable compression grid & needle holder
4884566, Apr 15 1988 The University of Michigan System and method for determining orientation of planes of imaging
4889526, Aug 27 1984 RAUSCHER, ELIZABETH A & VAN BISE, WILLIAM L , CO-OWNERS Non-invasive method and apparatus for modulating brain signals through an external magnetic or electric field to reduce pain
4896673, Jul 15 1988 Medstone International, Inc. Method and apparatus for stone localization using ultrasound imaging
4905698, Sep 13 1988 SMITHS MEDICAL MD, INC Method and apparatus for catheter location determination
4923459, Sep 14 1987 Kabushiki Kaisha Toshiba Stereotactics apparatus
4931056, Sep 04 1987 Neurodynamics, Inc. Catheter guide apparatus for perpendicular insertion into a cranium orifice
4945305, Oct 09 1986 BAE SYSTEMS PLC Device for quantitatively measuring the relative position and orientation of two bodies in the presence of metals utilizing direct current magnetic fields
4945914, Nov 10 1987 MARKER, LLC Method and apparatus for providing related images over time of a portion of the anatomy using at least four fiducial implants
4951653, Mar 02 1988 LABORATORY EQUIPMENT, CORP , A CORP OF INDIANA Ultrasound brain lesioning system
4955891, Jul 02 1985 OHIO MEDICAL INSTRUMENT COMPANY, INC , A CORP OF OHIO Method and apparatus for performing stereotactic surgery
4961422, Jan 21 1983 MED INSTITUTE, INC A CORPORATION OF IN Method and apparatus for volumetric interstitial conductive hyperthermia
4977655, Apr 25 1986 SURGICAL NAVIGATION TECHNOLOGIES, INC Method of making a transducer
4989608, Jul 02 1987 RATNER, ADAM V Device construction and method facilitating magnetic resonance imaging of foreign objects in a body
4991579, Nov 10 1987 MARKER, LLC Method and apparatus for providing related images over time of a portion of the anatomy using fiducial implants
5002058, Apr 25 1986 SURGICAL NAVIGATION TECHNOLOGIES, INC Ultrasonic transducer
5005592, Oct 27 1989 Becton Dickinson and Company Method and apparatus for tracking catheters
5013317, Feb 07 1990 Smith & Nephew Richards Inc. Medical drill assembly transparent to X-rays and targeting drill bit
5016639, Jul 18 1988 Method and apparatus for imaging the anatomy
5017139, Jul 05 1990 Mechanical support for hand-held dental/medical instrument
5027818, Dec 03 1987 UNIVERSITY OF FLORIDA RESEARCH FOUNDATION, INCORPORATED A NOT FOR PROFIT CORP OF FLORIDA Dosimetric technique for stereotactic radiosurgery
5030196, Apr 23 1980 Inoue-Japax Research Incorporated Magnetic treatment device
5030222, May 09 1990 CCG, INC , A CORP OF NEBRASKA Radiolucent orthopedic chuck
5031203, Feb 09 1990 TRECHA INTELLECTUAL PROPERTIES, INC Coaxial laser targeting device for use with x-ray equipment and surgical drill equipment during surgical procedures
5042486, Sep 29 1989 Siemens Aktiengesellschaft Catheter locatable with non-ionizing field and method for locating same
5047036, Nov 17 1989 Stereotactic device
5050608, Jul 12 1988 MIZUHO IKAKOGYO CO , LTD System for indicating a position to be operated in a patient's body
5054492, Dec 17 1990 Boston Scientific Scimed, Inc Ultrasonic imaging catheter having rotational image correlation
5057095, Nov 16 1989 Surgical implement detector utilizing a resonant marker
5059789, Oct 22 1990 International Business Machines Corp. Optical position and orientation sensor
5078140, May 08 1986 Imaging device - aided robotic stereotaxis system
5079699, Nov 27 1987 Picker International, Inc. Quick three-dimensional display
5086401, May 11 1990 INTERNATIONAL BUSINESS MACHINES CORPORATION, A CORP OF NY Image-directed robotic system for precise robotic surgery including redundant consistency checking
5094241, Nov 10 1987 MARKER, LLC Apparatus for imaging the anatomy
5097839, Jul 18 1988 Apparatus for imaging the anatomy
5098426, Feb 06 1989 AMO Manufacturing USA, LLC Method and apparatus for precision laser surgery
5099845, May 24 1989 Micronix Pty Ltd. Medical instrument location means
5099846, Dec 23 1988 Method and apparatus for video presentation from a variety of scanner imaging sources
5105829, Nov 16 1989 Surgical implement detector utilizing capacitive coupling
5107839, May 04 1990 HOUDEK, PAVEL V , Computer controlled stereotaxic radiotherapy system and method
5107843, Apr 06 1990 GENDEX-DEL MEDICAL IMAGING CORP Method and apparatus for thin needle biopsy in connection with mammography
5107862, Nov 16 1989 Surgical implement detector utilizing a powered marker
5109194, Dec 01 1989 Sextant Avionique Electromagnetic position and orientation detector for a pilot's helmet
5119817, Nov 10 1987 MARKER, LLC Apparatus for imaging the anatomy
5142930, Nov 08 1989 MARKER, LLC Interactive image-guided surgical system
5143076, Dec 23 1988 Tyrone L., Hardy Three-dimensional beam localization microscope apparatus for stereotactic diagnoses or surgery
5152288, Sep 23 1988 Siemens Aktiengesellschaft Apparatus and method for measuring weak, location-dependent and time-dependent magnetic fields
5160337, Sep 24 1990 INTEGRA BURLINGTON MA, INC Curved-shaped floor stand for use with a linear accelerator in radiosurgery
5161536, Mar 22 1991 ECHO CATH, INC ; ECHO CATH, LTD Ultrasonic position indicating apparatus and methods
5178164, Nov 10 1987 MARKER, LLC Method for implanting a fiducial implant into a patient
5178621, Dec 10 1991 ZIMMER TECHNOLOGY, INC Two-piece radio-transparent proximal targeting device for a locking intramedullary nail
5186174, May 21 1987 PROF DR SCHLONDORFF, GEORGE Process and device for the reproducible optical representation of a surgical operation
5187475, Jun 10 1991 Honeywell Inc. Apparatus for determining the position of an object
5188126, Nov 16 1989 Surgical implement detector utilizing capacitive coupling
5190059, Nov 16 1989 Surgical implement detector utilizing a powered marker
5193106, Aug 28 1990 X-ray identification marker
5197476, Mar 16 1989 BANK OF MONTREAL Locating target in human body
5197965, Jul 29 1992 INTEGRA LIFESCIENCES CORPORATION Skull clamp pin assembly
5198768, Sep 27 1989 GENERAL ELECTRIC MEDICAL SYSTEMS ISRAEL LTD , AN ISRAEL CORPORATION AFFILIATED WITH GENERAL ELECTRIC COMPANY Quadrature surface coil array
5198877, Oct 15 1990 BANK OF MONTREAL Method and apparatus for three-dimensional non-contact shape sensing
5207688, Oct 31 1991 Best Medical International, Inc Noninvasive head fixation method and apparatus
5211164, Nov 10 1987 MARKER, LLC Method of locating a target on a portion of anatomy
5211165, Sep 03 1991 General Electric Company Tracking system to follow the position and orientation of a device with radiofrequency field gradients
5211176, Nov 30 1990 Fuji Photo Optical Co., Ltd. Ultrasound examination system
5212720, Jan 29 1992 Research Foundation-State University of N.Y. Dual radiation targeting system
5214615, Feb 26 1990 ACOUSTIC POSITIONING RESEARCH INC Three-dimensional displacement of a body with computer interface
5219351, Oct 24 1990 GENERAL ELECTRIC CGR S A Mammograph provided with an improved needle carrier
5222499, Nov 15 1989 MARKER, LLC Method and apparatus for imaging the anatomy
5224049, Apr 10 1990 Method, system and mold assembly for use in preparing a dental prosthesis
5228442, Feb 15 1991 Boston Scientific Scimed, Inc Method for mapping, ablation, and stimulation using an endocardial catheter
5230338, Nov 10 1987 MARKER, LLC Interactive image-guided surgical system for displaying images corresponding to the placement of a surgical tool or the like
5230623, Dec 10 1991 INTEGRA BURLINGTON MA, INC Operating pointer with interactive computergraphics
5233990, Jan 13 1992 Method and apparatus for diagnostic imaging in radiation therapy
5237996, Feb 11 1992 Cardiac Pathways Corporation Endocardial electrical mapping catheter
5249581, Jul 15 1991 BANK OF MONTREAL Precision bone alignment
5251127, Feb 01 1988 XENON RESEARCH, INC Computer-aided surgery apparatus
5251635, Sep 03 1991 General Electric Company Stereoscopic X-ray fluoroscopy system using radiofrequency fields
5253647, Apr 13 1990 Olympus Optical Co., Ltd. Insertion position and orientation state pickup for endoscope
5255680, Sep 03 1991 General Electric Company Automatic gantry positioning for imaging systems
5257636, Apr 02 1991 Steven J., White; Deborah O., White; WHITE, DEBORAH O , 145 ALPINE DRIVE, ROCHESTER, NY 14618 A CITIZEN OF USA Apparatus for determining position of an endotracheal tube
5257998, Sep 20 1989 Mitaka Kohki Co., Ltd.; Takaaki, Takizawa Medical three-dimensional locating apparatus
5261404, Jul 08 1991 Three-dimensional mammal anatomy imaging system and method
5265610, Sep 03 1991 General Electric Company Multi-planar X-ray fluoroscopy system using radiofrequency fields
5265611, Sep 23 1988 Siemens Aktiengesellschaft Apparatus for measuring weak, location-dependent and time-dependent magnetic field
5269759, Jul 28 1992 Cordis Corporation Magnetic guidewire coupling for vascular dilatation apparatus
5271400, Apr 01 1992 General Electric Company Tracking system to monitor the position and orientation of a device using magnetic resonance detection of a sample contained within the device
5273025, Apr 13 1990 Olympus Optical Co., Ltd. Apparatus for detecting insertion condition of endoscope
5273039, Oct 16 1989 Olympus Optical Co., Ltd. Surgical microscope apparatus having a function to display coordinates of observation point
5274551, Nov 29 1991 General Electric Company Method and apparatus for real-time navigation assist in interventional radiological procedures
5279309, Jun 13 1991 International Business Machines Corporation Signaling device and method for monitoring positions in a surgical operation
5280427, Nov 27 1989 BARD INTERNATIONAL, INC ; Radi Medical Systems AB Puncture guide for computer tomography
5285787, Sep 12 1989 Kabushiki Kaisha Toshiba Apparatus for calculating coordinate data of desired point in subject to be examined
5291199, Jan 06 1977 Northrop Grumman Corporation Threat signal detection system
5291889, May 23 1991 VANGUARD IMAGING LTD A CORP OF DELAWARE Apparatus and method for spatially positioning images
5295483, May 11 1990 BANK OF MONTREAL Locating target in human body
5297549, Sep 23 1992 ST JUDE MEDICAL, DAIG DIVISION, INC ; ST JUDE MEDICAL, ATRIAL FIBRILLATION DIVISION, INC Endocardial mapping system
5299253, Apr 10 1992 PERIMMUNE HOLDINGS, INC Alignment system to overlay abdominal computer aided tomography and magnetic resonance anatomy with single photon emission tomography
5299254, Nov 24 1989 Technomed Medical Systems Method and apparatus for determining the position of a target relative to a reference of known co-ordinates and without a priori knowledge of the position of a source of radiation
5299288, May 11 1990 International Business Machines Corporation; Regents of the University of California Image-directed robotic system for precise robotic surgery including redundant consistency checking
5300080, Nov 01 1991 Stereotactic instrument guided placement
5305091, Dec 07 1992 Kodak Graphic Communications Canada Company Optical coordinate measuring system for large objects
5305203, Feb 01 1988 XENON RESEARCH, INC Computer-aided surgery apparatus
5306271, Mar 09 1992 IZI Medical Products, LLC Radiation therapy skin markers
5307072, Jul 09 1992 CHITTENDEN BANK Non-concentricity compensation in position and orientation measurement systems
5309913, Nov 30 1992 The Cleveland Clinic Foundation; CLEVELAND CLINIC FOUNDATION, THE Frameless stereotaxy system
5315630, Mar 11 1992 DEUTSCHES KREBSFORSCHUNGSZENTRUM Positioning device in medical apparatus
5316024, Jul 23 1992 Abbott Laboratories Tube placement verifier system
5318025, Apr 01 1992 General Electric Company Tracking system to monitor the position and orientation of a device using multiplexed magnetic resonance detection
5320111, Feb 07 1992 LIVINGSTON PRODUCTS, INC Light beam locator and guide for a biopsy needle
5325728, Jun 22 1993 Medtronic, Inc. Electromagnetic flow meter
5325873, Jul 23 1992 Abbott Laboratories; White's Electronics, Inc. Tube placement verifier system
5329944, Nov 16 1989 Surgical implement detector utilizing an acoustic marker
5330485, Nov 01 1991 Cerebral instrument guide frame and procedures utilizing it
5333168, Jan 29 1993 GE Medical Systems Global Technology Company, LLC Time-based attenuation compensation
5353795, Dec 10 1992 General Electric Company Tracking system to monitor the position of a device using multiplexed magnetic resonance detection
5353800, Dec 11 1992 University of Florida Research Foundation, Incorporated Implantable pressure sensor lead
5353807, Dec 07 1992 STEREOTAXIS, INC Magnetically guidable intubation device
5359417, Oct 18 1991 Carl-Zeiss-Stiftung Surgical microscope for conducting computer-supported stereotactic microsurgery and a method for operating the same
5368030, Sep 09 1992 IZI Medical Products, LLC Non-invasive multi-modality radiographic surface markers
5371778, Nov 29 1991 Picker International, Inc. Concurrent display and adjustment of 3D projection, coronal slice, sagittal slice, and transverse slice images
5375596, Sep 29 1972 NEO MEDICAL INC Method and apparatus for determining the position of catheters, tubes, placement guidewires and implantable ports within biological tissue
5377678, Sep 03 1991 General Electric Company Tracking system to follow the position and orientation of a device with radiofrequency fields
5383454, Oct 19 1990 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
5385146, Jan 08 1993 NEW CHESTER INSURANCE COMPANY LIMITED Orthogonal sensing for use in clinical electrophysiology
5385148, Jul 30 1993 Regents of the University of California, The Cardiac imaging and ablation catheter
5386828, Dec 23 1991 SMITHS MEDICAL MD, INC Guide wire apparatus with location sensing member
5389101, Apr 21 1992 SOFAMOR DANEK HOLDINGS, INC Apparatus and method for photogrammetric surgical localization
5391199, Jul 20 1993 Biosense, Inc Apparatus and method for treating cardiac arrhythmias
5394457, Oct 08 1992 Leibinger GmbH Device for marking body sites for medical examinations
5394875, Oct 21 1993 MARKER, LLC Automatic ultrasonic localization of targets implanted in a portion of the anatomy
5397329, Nov 10 1987 MARKER, LLC Fiducial implant and system of such implants
5398684, Dec 23 1988 Method and apparatus for video presentation from scanner imaging sources
5399146, Dec 13 1993 Isocentric lithotripter
5400384, Jan 29 1993 GE Medical Systems Global Technology Company, LLC Time-based attenuation compensation
5402801, Nov 02 1993 International Business Machines Corporation System and method for augmentation of surgery
5408409, Sep 18 1991 International Business Machines Corporation Image-directed robotic system for precise robotic surgery including redundant consistency checking
5409497, Mar 11 1991 Siemens AG Orbital aiming device for mammo biopsy
5413573, May 24 1991 Onesys Oy Device for surgical procedures
5417210, May 27 1992 INTERNATIONAL BUSINESS MACHINES CORPORATION A CORP OF NEW YORK System and method for augmentation of endoscopic surgery
5419325, Jun 23 1994 General Electric Company Magnetic resonance (MR) angiography using a faraday catheter
5423334, Feb 01 1993 C R BARD, INC Implantable medical device characterization system
5425367, Sep 04 1991 CORPAK MEDSYSTEMS, INC Catheter depth, position and orientation location system
5425382, Sep 14 1993 University of Washington Apparatus and method for locating a medical tube in the body of a patient
5426683, Mar 14 1994 GE Medical Systems Global Technology Company, LLC One piece C-arm for X-ray diagnostic equipment
5426687, Jul 07 1992 INNOVATIVE CARE LTD Laser targeting device for use with image intensifiers in surgery
5427097, Dec 10 1992 PACIFIC REPUBLIC CAPITAL CORP Apparatus for and method of carrying out stereotaxic radiosurgery and radiotherapy
5429132, Aug 24 1990 Imperial College of Science Technology and Medicine Probe system
5433198, Mar 11 1993 CATHEFFECTS, INC Apparatus and method for cardiac ablation
5437277, Nov 18 1991 General Electric Company Inductively coupled RF tracking system for use in invasive imaging of a living body
5443066, Jan 29 1993 General Electric Company Invasive system employing a radiofrequency tracking system
5443489, Jul 20 1993 Biosense, Inc. Apparatus and method for ablation
5444756, Feb 09 1994 Minnesota Mining and Manufacturing Company X-ray machine, solid state radiation detector and method for reading radiation detection information
5445144, Dec 16 1993 Purdue Research Foundation Apparatus and method for acoustically guiding, positioning, and monitoring a tube within a body
5445150, Nov 18 1991 General Electric Company Invasive system employing a radiofrequency tracking system
5445166, Nov 02 1993 International Business Machines Corporation System for advising a surgeon
5446548, Oct 08 1993 National Research Council of Canada; Siemens Medical Systems, Inc Patient positioning and monitoring system
5447154, Jul 31 1992 UNIVERSITE JOSEPH FOURIER Method for determining the position of an organ
5448610, Feb 09 1993 Hitachi Medical Corporation Digital X-ray photography device
5453686, Apr 08 1993 CHITTENDEN BANK Pulsed-DC position and orientation measurement system
5456718, Nov 17 1992 Apparatus for detecting surgical objects within the human body
5457641, Jun 29 1990 Sextant Avionique Method and apparatus for determining an orientation associated with a mobile system, especially a line of sight inside a helmet visor
5458718, Mar 19 1993 VIP Industries Limited Heat sealing method for making a luggage case
5464446, Oct 12 1993 Medtronic, Inc Brain lead anchoring system
5469847, Sep 09 1992 IZI Medical Products, LLC Radiographic multi-modality skin markers
5478341, Dec 23 1991 ZIMMER TECHNOLOGY, INC Ratchet lock for an intramedullary nail locking bolt
5478343, Jun 13 1991 STRYKER TRAUMA GMBH, CORPORATION OF REPUBLIC OF GERMANY Targeting device for bone nails
5480422, Jul 20 1993 Biosense, Inc. Apparatus for treating cardiac arrhythmias
5480439, Feb 13 1991 Lunar Corporation Method for periprosthetic bone mineral density measurement
5483961, Mar 19 1993 COMPASS INTERNATIONAL, INC Magnetic field digitizer for stereotactic surgery
5485849, Jan 31 1994 EP Technologies, Inc. System and methods for matching electrical characteristics and propagation velocities in cardiac tissue
5487391, Jan 28 1994 EP Technologies, Inc. Systems and methods for deriving and displaying the propagation velocities of electrical events in the heart
5487729, Oct 08 1993 Cordis Corporation Magnetic guidewire coupling for catheter exchange
5487757, Jul 20 1993 Medtronic CardioRhythm Multicurve deflectable catheter
5490196, Mar 18 1994 RAPISCAN SYSTEMS OY Multi energy system for x-ray imaging applications
5494034, May 27 1987 Georg, Schlondorff Process and device for the reproducible optical representation of a surgical operation
5503416, Mar 10 1994 GE Medical Systems Global Technology Company, LLC Undercarriage for X-ray diagnostic equipment
5513637, Sep 29 1992 NEO MEDICAL INC Method and apparatus for determining the position of catheters, tubes, placement guidewires and implantable ports within biological tissue
5514146, Sep 17 1993 COMPUMEDICS GERMANY GMBH Device for accommodating at least one sonographic probe
5515160, Mar 12 1992 Aesculap AG Method and apparatus for representing a work area in a three-dimensional structure
5517990, Nov 30 1992 CLEVELAND CLINIC FOUNDATION, THE Stereotaxy wand and tool guide
5531227, Jan 28 1994 SCHNEIDER MEDICAL TECHNOLOGIES, INC Imaging device and method
5531520, Sep 01 1994 ENGILITY CORPORATION System and method of registration of three-dimensional data sets including anatomical body data
5542938, Jul 28 1992 Cordis Corporation Magnetic guidewire coupling for catheter exchange
5543951, Mar 15 1994 CCS Technology, Inc Method for receive-side clock supply for video signals digitally transmitted with ATM in fiber/coaxial subscriber line networks
5546940, Jan 28 1994 EP Technologies, Inc. System and method for matching electrical characteristics and propagation velocities in cardiac tissue to locate potential ablation sites
5546949, Apr 26 1994 Method and apparatus of logicalizing and determining orientation of an insertion end of a probe within a biotic structure
5546951, Jul 20 1993 Biosense, Inc. Method and apparatus for studying cardiac arrhythmias
5551429, Feb 12 1993 MARKER, LLC Method for relating the data of an image space to physical space
5558091, Oct 06 1993 Biosense, Inc Magnetic determination of position and orientation
5566681, May 02 1995 Apparatus and method for stabilizing a body part
5568384, Oct 13 1992 INTELLECTUAL VENTURES HOLDING 15 LLC Biomedical imaging and analysis
5568809, Jul 20 1993 Biosense, Inc. Apparatus and method for intrabody mapping
5572999, May 27 1992 International Business Machines Corporation Robotic system for positioning a surgical instrument relative to a patient's body
5573533, Apr 10 1992 Medtronic CardioRhythm Method and system for radiofrequency ablation of cardiac tissue
5575794, Feb 12 1973 MARKER, LLC Tool for implanting a fiducial marker
5575798, Nov 17 1989 Stereotactic device
5583909, Dec 20 1994 GE Medical Systems Global Technology Company, LLC C-arm mounting structure for mobile X-ray imaging system
5588430, Feb 14 1995 UNIVERSITY OF FLORIDA RESEARCH FOUNDATION, INC Repeat fixation for frameless stereotactic procedure
5590215, Oct 15 1993 MARKER, LLC Method for providing medical images
5592939, Jun 14 1995 SURGICAL NAVIGATION TECHNOLOGIES, INC Method and system for navigating a catheter probe
5595193, Jan 27 1994 MARKER, LLC Tool for implanting a fiducial marker
5596228, Mar 10 1994 GE Medical Systems Global Technology Company, LLC Apparatus for cooling charge coupled device imaging systems
5600330, Jul 12 1994 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Device for measuring position and orientation using non-dipole magnetic fields
5603318, Apr 21 1992 SOFAMOR DANEK HOLDINGS, INC Apparatus and method for photogrammetric surgical localization
5611025, Nov 23 1994 General Electric Company Virtual internal cavity inspection system
5617462, Aug 07 1995 GE Medical Systems Global Technology Company, LLC Automatic X-ray exposure control system and method of use
5617857, Jun 06 1995 IMAGE GUIDED TECHNOLOGIES, INC Imaging system having interactive medical instruments and methods
5619261, Jul 25 1994 GE Medical Systems Global Technology Company, LLC Pixel artifact/blemish filter for use in CCD video camera
5622169, Sep 14 1993 Washington, University of Apparatus and method for locating a medical tube in the body of a patient
5622170, Oct 19 1990 IMAGE GUIDED TECHNOLOGIES, INC Apparatus for determining the position and orientation of an invasive portion of a probe inside a three-dimensional body
5627873, Aug 04 1995 GE Medical Systems Global Technology Company, LLC Mini C-arm assembly for mobile X-ray imaging system
5628315, Sep 15 1994 Brainlab AG Device for detecting the position of radiation target points
5630431, Jun 13 1991 International Business Machines Corporation System and method for augmentation of surgery
5636644, Mar 17 1995 Applied Medical Resources Corporation Method and apparatus for endoconduit targeting
5638819, Aug 29 1995 Method and apparatus for guiding an instrument to a target
5640170, Jun 05 1995 CHITTENDEN BANK Position and orientation measuring system having anti-distortion source configuration
5642395, Aug 07 1995 GE Medical Systems Global Technology Company, LLC Imaging chain with miniaturized C-arm assembly for mobile X-ray imaging system
5643268, Sep 27 1994 Brainlab AG Fixation pin for fixing a reference system to bony structures
5645065, Sep 04 1991 CORPAK MEDSYSTEMS, INC Catheter depth, position and orientation location system
5646524, Jun 16 1992 ELBIT SYSTEMS LTD Three dimensional tracking system employing a rotating field
5647361, Dec 18 1992 Fonar Corporation Magnetic resonance imaging method and apparatus for guiding invasive therapy
5662111, Jan 28 1991 INTEGRA RADIONICS, INC Process of stereotactic optical navigation
5664001, Mar 24 1995 J MORITA MANUFACTURING CORPORATION; Hamamatsu Photonics Kabushiki Kaisha Medical X-ray imaging apparatus
5674296, Nov 14 1994 MEDTRONIC SOFAMOR DANEK, INC Human spinal disc prosthesis
5676673, Sep 13 1995 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system with error detection for use in medical applications
5681260, Sep 22 1989 Olympus Optical Co., Ltd. Guiding apparatus for guiding an insertable body within an inspected object
5682886, Dec 26 1995 C A S T , L L C Computer-assisted surgical system
5682890, Jan 26 1995 Picker International, Inc.; CLEVELAND CLINIC FOUNDATION, THE Magnetic resonance stereotactic surgery with exoskeleton tissue stabilization
5690108, Nov 28 1994 OHIO STATE UNIVERSITY, THE Interventional medicine apparatus
5694945, Jul 20 1993 Biosense, Inc. Apparatus and method for intrabody mapping
5695500, Nov 02 1993 International Business Machines Corporation System for manipulating movement of a surgical instrument with computer controlled brake
5695501, Sep 30 1994 SCHAERER MEDICAL USA, INC Apparatus for neurosurgical stereotactic procedures
5697377, Nov 22 1995 Medtronic, Inc Catheter mapping system and method
5702406, Sep 15 1994 Brainlab AG Device for noninvasive stereotactic immobilization in reproducible position
5711299, Jan 26 1996 GN RESOUND A S Surgical guidance method and system for approaching a target within a body
5713946, Jul 20 1993 Biosense, Inc. Apparatus and method for intrabody mapping
5715822, Sep 28 1995 General Electric Company Magnetic resonance devices suitable for both tracking and imaging
5715836, Feb 16 1993 Brainlab AG Method and apparatus for planning and monitoring a surgical operation
5718241, Jun 07 1995 Biosense, Inc Apparatus and method for treating cardiac arrhythmias with no discrete target
5727552, Jan 11 1996 Medtronic, Inc. Catheter and electrical lead location system
5727553, Mar 25 1996 Catheter with integral electromagnetic location identification device
5729129, Jun 07 1995 Biosense, Inc Magnetic location system with feedback adjustment of magnetic field generator
5730129, Apr 03 1995 General Electric Company Imaging of interventional devices in a non-stationary subject
5730130, Feb 12 1993 MARKER, LLC Localization cap for fiducial markers
5732703, Nov 30 1992 The Cleveland Clinic Foundation; CLEVELAND CLINIC FOUNDATION, THE Stereotaxy wand and tool guide
5735278, Mar 15 1996 IMRIS, INC Surgical procedure with magnetic resonance imaging
5738096, Jul 20 1993 Biosense, Inc Cardiac electromechanics
5740802, Apr 20 1993 General Electric Company Computer graphic and live video system for enhancing visualization of body structures during surgery
5741214, Dec 20 1993 Terumo Kabushiki Kaisha Accessory pathway detecting/cauterizing apparatus
5742394, Jun 14 1996 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Optical 6D measurement system with two fan shaped beams rotating around one axis
5744953, Aug 29 1996 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Magnetic motion tracker with transmitter placed on tracked object
5748767, Aug 10 1988 XENON RESEARCH, INC Computer-aided surgery apparatus
5749362, May 27 1992 International Business Machines Corporation Method of creating an image of an anatomical feature where the feature is within a patient's body
5749835, Sep 06 1994 SMITHS MEDICAL MD, INC Method and apparatus for location of a catheter tip
5752513, Jun 07 1995 Biosense, Inc Method and apparatus for determining position of object
5755725, Sep 07 1993 SARIF BIOMEDICAL LLC Computer-assisted microsurgery methods and equipment
5758667, Jan 26 1995 Siemens Elema AB Device for locating a port on a medical implant
5762064, Jan 23 1995 Northrop Grumman Systems Corporation Medical magnetic positioning system and method for determining the position of a magnetic probe
5767669, Jun 14 1996 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Magnetic field position and orientation measurement system with dynamic eddy current rejection
5767960, Jun 14 1996 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Optical 6D measurement system with three fan-shaped beams rotating around one axis
5769789, Feb 12 1993 MARKER, LLC Automatic technique for localizing externally attached fiducial markers in volume images of the head
5769843, Feb 20 1996 CorMedica Percutaneous endomyocardial revascularization
5769861, Sep 28 1995 Brainlab AG Method and devices for localizing an instrument
5772594, Oct 16 1996 SOFAMOR DANEK HOLDINGS, INC Fluoroscopic image guided orthopaedic surgery system with intraoperative registration
5775322, Jun 27 1996 LUCENT MEDICAL SYSTEMS, INC Tracheal tube and methods related thereto
5776064, Nov 30 1992 The Cleveland Clinic Foundation Frameless stereotaxy system for indicating the position and axis of a surgical probe
5782765, Apr 25 1996 DROGO IP LLC Medical positioning system
5787886, Mar 19 1993 COMPASS INTERNATIONAL, INC Magnetic field digitizer for stereotactic surgery
5792055, Mar 18 1994 Schneider (USA) Inc. Guidewire antenna
5795294, May 21 1994 Carl-Zeiss-Stiftung Procedure for the correlation of different coordinate systems in computer-supported, stereotactic surgery
5797849, Mar 28 1995 Sonometrics Corporation Method for carrying out a medical procedure using a three-dimensional tracking and imaging system
5799055, May 15 1996 Northwestern University Apparatus and method for planning a stereotactic surgical procedure using coordinated fluoroscopy
5799099, Feb 12 1993 MARKER, LLC Automatic technique for localizing externally attached fiducial markers in volume images of the head
5800352, Sep 13 1995 GE Medical Systems Global Technology Company, LLC Registration system for use with position tracking and imaging system for use in medical applications
5800535, Feb 09 1994 The University of Iowa Research Foundation Wireless prosthetic electrode for the brain
5802719, Mar 14 1994 GE Medical Systems Global Technology Company, LLC One piece C-arm for X-ray diagnostic equipment
5803089, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
5807252, Feb 23 1995 AESCULAP AG & CO KG Method and apparatus for determining the position of a body part
5810008, Dec 03 1996 Brainlab AG Apparatus and method for visualizing ultrasonic images
5810728, Apr 03 1993 U.S. Philips Corporation MR imaging method and apparatus for guiding a catheter
5810735, Feb 27 1995 Medtronic, Inc. External patient reference sensors
5820553, Aug 16 1996 Siemens Medical Systems, Inc. Identification system and method for radiation therapy
5823192, Jul 31 1996 University of Pittsburgh of the Commonwealth System of Higher Education Apparatus for automatically positioning a patient for treatment/diagnoses
5823958, Nov 26 1990 ARTMA MEDICAL TECHNOLOGIES AG System and method for displaying a structural data image in real-time correlation with moveable body
5828725, Jul 03 1996 Eliav Medical Imaging Systems LTD Processing images for removal of artifacts
5828770, Feb 20 1996 BANK OF MONTREAL System for determining the spatial position and angular orientation of an object
5829444, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
5831260, Sep 10 1996 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Hybrid motion tracker
5833608, Oct 06 1993 Biosense, Inc. Magnetic determination of position and orientation
5834759, May 22 1997 Philips Electronics Ltd Tracking device having emitter groups with different emitting directions
5836954, Apr 21 1992 SOFAMOR DANEK HOLDINGS, INC Apparatus and method for photogrammetric surgical localization
5840024, Oct 18 1993 Olympus Optical Co., Ltd. Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
5840025, Jul 20 1993 Biosense, Inc. Apparatus and method for treating cardiac arrhythmias
5843076, Jun 12 1996 CORDIS WEBSTER, INC Catheter with an electromagnetic guidance sensor
5848967, Jan 28 1991 Sherwood Services AG Optically coupled frameless stereotactic system and method
5851183, Oct 19 1990 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
5865846, Nov 14 1994 Human spinal disc prosthesis
5868674, Nov 24 1995 U S PHILIPS CORPORATION MRI-system and catheter for interventional procedures
5868675, Oct 05 1989 Medtronic, Inc Interactive system for local intervention inside a non-homogeneous structure
5871445, Apr 26 1993 ST LOUIS UNIVERSITY System for indicating the position of a surgical probe within a head on an image of the head
5871455, Apr 30 1996 Nikon Corporation Ophthalmic apparatus
5871487, Jun 24 1994 NEUROTECH USA, INC Microdrive for use in stereotactic surgery
5873822, Sep 13 1995 GE Medical Systems Global Technology Company, LLC Automatic registration system for use with position tracking and imaging system for use in medical applications
5882304, Oct 27 1997 Picker Nordstar Corporation Method and apparatus for determining probe location
5884410, Dec 21 1995 Carl Zeiss Industrielle Messtechnik GmbH Sensing system for coordinate measuring equipment
5889834, Sep 28 1995 BANK OF AMERICA, N A , AS ADMINISTRATIVE AGENT Blade collimator for radiation therapy
5891034, Oct 19 1990 ST LOUIS UNIVERSITY System for indicating the position of a surgical probe within a head on an image of the head
5891157, Sep 30 1994 SCHAERER MEDICAL USA, INC Apparatus for surgical stereotactic procedures
5904691, Sep 30 1996 Picker International, Inc.; The Cleveland Clinic Foundation Trackable guide block
5907395, Jun 06 1997 Image Guided Technologies, Inc. Optical fiber probe for position measurement
5913820, Aug 14 1992 British Telecommunications public limited company Position location system
5920395, Apr 22 1993 IMAGE GUIDED TECHNOLOGIES, INC System for locating relative positions of objects in three dimensional space
5921992, Apr 11 1997 INTEGRA RADIONICS, INC Method and system for frameless tool calibration
5923727, Sep 30 1997 Siemens Medical Solutions USA, Inc Method and apparatus for calibrating an intra-operative X-ray system
5928248, Feb 25 1997 Biosense, Inc Guided deployment of stents
5938603, Dec 01 1997 CORDIS WEBSTER, INC Steerable catheter with electromagnetic sensor
5938694, Nov 10 1993 Medtronic CardioRhythm Electrode array catheter
5947980, Sep 30 1993 Price Invena ApS Device for squeezing and cutting an umbilical cord
5947981, Jan 31 1995 INTEGRA BURLINGTON MA, INC Head and neck localizer
5950629, Nov 02 1993 International Business Machines Corporation System for assisting a surgeon during surgery
5951475, Sep 25 1997 Integrated Surgical Systems Methods and apparatus for registering CT-scan data to multiple fluoroscopic images
5951571, Sep 19 1996 Brainlab AG Method and apparatus for correlating a body with an image of the body
5954647, Feb 14 1995 UNIVERSITY OF FLORIDA RESEARCH FOUNDATION, INC Marker system and related stereotactic procedure
5957844, Dec 03 1996 Brainlab AG Apparatus and method for visualizing ultrasonic images
5964796, Sep 24 1993 Boston Scientific Scimed, Inc Catheter assembly, catheter and multi-port introducer for use therewith
5967980, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
5967982, Dec 09 1997 The Cleveland Clinic Foundation Non-invasive spine and bone registration for frameless stereotaxy
5968047, Apr 05 1996 Solana Surgical, LLC Fixation devices
5971997, Feb 03 1995 INTEGRA RADIONICS, INC Intraoperative recalibration apparatus for stereotactic navigators
5976156, Jun 13 1991 International Business Machines Corporation Stereotaxic apparatus and method for moving an end effector
5980535, Sep 30 1996 CLEVELAND CLINIC FOUNDATION, THE; PICKER INTERNATIONAL, INC Apparatus for anatomical tracking
5983126, Nov 22 1995 Medtronic, Inc. Catheter location system and method
5987349, Oct 19 1990 Image Guided Technologies, Inc. Method for determining the position and orientation of two moveable objects in three-dimensional space
5987960, Sep 26 1997 MAKO SURGICAL CORP Tool calibrator
5999837, Sep 26 1997 MAKO SURGICAL CORP Localizing and orienting probe for view devices
5999840, Sep 01 1994 TASC, INC ; BRIGHAM & WOMEN S HOSPITAL, INC , THE System and method of registration of three-dimensional data sets
6001130, Nov 14 1994 MEDTRONIC SOFAMOR DANEK, INC Human spinal disc prosthesis with hinges
6006126, Jan 28 1991 INTEGRA RADIONICS, INC System and method for stereotactic registration of image scan data
6006127, Feb 28 1997 U S PHILIPS CORPORATION Image-guided surgery system
6013087, Sep 26 1996 BANK OF MONTREAL Image-guided surgery system
6014580, Nov 12 1997 STEREOTAXIS, INC Device and method for specifying magnetic field for surgical applications
6016439, Oct 15 1996 Biosense, Inc Method and apparatus for synthetic viewpoint imaging
6019725, Mar 28 1995 Sonometrics Corporation Three-dimensional tracking and imaging system
6024695, Nov 02 1993 International Business Machines Corporation System and method for augmentation of surgery
6050724, Jan 31 1997 U. S. Philips Corporation Method of and device for position detection in X-ray imaging
6059718, Oct 18 1993 Olympus Optical Co., Ltd. Endoscope form detecting apparatus in which coil is fixedly mounted by insulating member so that form is not deformed within endoscope
6063022, Jan 03 1997 Biosense, Inc. Conformal catheter
6071288, Sep 30 1994 SCHAERER MEDICAL USA, INC Apparatus and method for surgical stereotactic procedures
6073043, Dec 22 1997 CorMedica Measuring position and orientation using magnetic fields
6076008, Apr 26 1993 St. Louis University System for indicating the position of a surgical probe within a head on an image of the head
6096050, Mar 19 1999 Brainlab AG Method and apparatus for correlating a body with an image of the body
6104944, Nov 17 1997 SURGICAL NAVIGATION TECHNOLOGIES, INC System and method for navigating a multiple electrode catheter
6118845, Jun 29 1998 Medtronic Navigation, Inc System and methods for the reduction and elimination of image artifacts in the calibration of X-ray imagers
6122538, Jan 16 1997 Siemens Medical Solutions USA, Inc Motion--Monitoring method and system for medical devices
6122541, May 04 1995 INTEGRA BURLINGTON MA, INC Head band for frameless stereotactic registration
6131396, Sep 27 1996 Siemens Healthcare GmbH Heat radiation shield, and dewar employing same
6139183, Oct 17 1997 Siemens Healthcare GmbH X-ray exposure system for 3D imaging
6147480, Oct 23 1997 Biosense, Inc Detection of metal disturbance
6149592, Nov 26 1997 Picker International, Inc.; PICKER INTERNATIONAL, INC Integrated fluoroscopic projection image data, volumetric image data, and surgical device position data
6156067, Nov 14 1994 MEDTRONIC SOFAMOR DANEK, INC Human spinal disc prosthesis
6161032, Mar 30 1998 Biosense, Inc Three-axis coil sensor
6165181, Apr 21 1992 SOFAMOR DANEK HOLDINGS, INC Apparatus and method for photogrammetric surgical localization
6167296, Jun 28 1996 CICAS IP LLC Method for volumetric image navigation
6172499, Oct 29 1999 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Eddy current error-reduced AC magnetic position measurement system
6175756, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
6178345, Jun 30 1998 Brainlab AG Method for detecting the exact contour of targeted treatment areas, in particular, the external contour
6194639, May 01 1996 UNIVERSITY OF QUEENSLAND, THE; Golden Circle Limited; STATE OF QUEENSLAND, THE ACC synthase genes from pineapple
6201387, Oct 07 1997 Biosense, Inc Miniaturized position sensor having photolithographic coils for tracking a medical probe
6203497, Dec 03 1996 Brainlab AG Apparatus and method for visualizing ultrasonic images
6211666, Feb 27 1996 Biosense, Inc. Object location system and method using field actuation sequences having different field strengths
6223067, Apr 11 1997 Brainlab AG Referencing device including mouthpiece
6233476, May 18 1999 ST JUDE MEDICAL INTERNATIONAL HOLDING S À R L Medical positioning system
6246231, Jul 29 1999 Ascension Technology Corporation; ROPER ASCENSION ACQUISITION, INC Magnetic field permeable barrier for magnetic position measurement system
6259942, Sep 27 1997 Brainlab AG Method and apparatus for recording a three-dimensional image of a body part
6273896, Apr 21 1998 Neutar, LLC Removable frames for stereotactic localization
6285902, Feb 10 1999 STRYKER EUROPEAN HOLDINGS I, LLC Computer assisted targeting device for use in orthopaedic surgery
6298262, Apr 21 1998 NEUTAR L L C Instrument guidance for stereotactic surgery
6314310, Mar 31 1997 Biosense, Inc. X-ray guided surgical location system with extended mapping volume
6332089, Aug 26 1996 Biosense, Inc. Medical procedures and apparatus using intrabody probes
6341231, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
6351659, Sep 12 1996 Brainlab AG Neuro-navigation system
6381485, Oct 28 1999 Medtronic Navigation, Inc Registration of human anatomy integrated for electromagnetic localization
6424856, Jun 30 1998 Brainlab AG Method for the localization of targeted treatment areas in soft body parts
6427314, Oct 06 1993 Biosense, Inc. Magnetic determination of position and orientation
6428547, Nov 25 1999 Brainlab AG Detection of the shape of treatment devices
6434415, Oct 19 1990 St. Louis University; Surgical Navigation Technologies, Inc. System for use in displaying images of a body part
6437567, Dec 06 1999 General Electric Company Radio frequency coil for open magnetic resonance imaging system
6445943, Sep 15 1994 GE Medical Systems Global Technology Company, LLC Position tracking and imaging system for use in medical applications
6470207, Mar 23 1999 Medtronic Navigation, Inc Navigational guidance via computer-assisted fluoroscopic imaging
6474341, Oct 28 1999 Medtronic Navigation, Inc Surgical communication and power system
6478802, Jun 09 2000 GE Medical Systems Global Technology Company, LLC Method and apparatus for display of an image guided drill bit
6484049, Apr 28 2000 STRYKER EUROPEAN HOLDINGS I, LLC Fluoroscopic tracking and visualization system
6490475, Apr 28 2000 STRYKER EUROPEAN HOLDINGS I, LLC Fluoroscopic tracking and visualization system
6493573, Oct 28 1999 SURGICAL NAVIGATION TECHNOLOGIES, INC Method and system for navigating a catheter probe in the presence of field-influencing objects
6498944, Feb 01 1996 Biosense, Inc. Intrabody measurement
6499488, Oct 28 1999 SURGICAL NAVIGATION TECHNOLOGIES, INC Surgical sensor
6516046, Nov 04 1999 Brainlab AG Exact patient positioning by comparing reconstructed x-ray images and linac x-ray images
6527443, Apr 20 1999 Brainlab AG Process and apparatus for image guided treatment with an integration of X-ray detection and navigation system
6609022, Jan 12 2000 Brainlab AG Intraoperative navigation updating
6611700, Dec 30 1999 Brainlab AG Method and apparatus for positioning a body for radiation using a position sensor
6701179, Oct 28 1999 SURGICAL NAVIGATION TECHNOLOGIES, INC Coil structures and methods for generating magnetic fields
20010007918,
20020095081,
20040024309,
CA964149,
DE19715202,
DE19747427,
DE19751761,
DE19832296,
DE3042343,
DE3508730,
DE3717871,
DE3831278,
DE3838011,
DE4213426,
DE4225112,
DE4233978,
EP62941,
EP119660,
EP155857,
EP319844,
EP326768,
EP350996,
EP419729,
EP427358,
EP456103,
EP581704,
EP651968,
EP655138,
EP894473,
EP908146,
EP930046,
FR2417970,
GB2094590,
GB2164856,
JP3267054,
JP6194639,
JP62327,
JP63240851,
RE32619, Oct 17 1984 Apparatus and method for nuclear magnetic resonance scanning and mapping
RE35025, Jan 07 1991 OEC MEDICAL SYSTEMS, INC Battery enhanced power generation for mobile X-ray machine
RE35816, Mar 30 1995 BANK OF MONTREAL Method and apparatus for three-dimensional non-contact shape sensing
WO9151,
WO130437,
WO8809151,
WO8905123,
WO9005494,
WO9103982,
WO9104711,
WO9107726,
WO9203090,
WO9206645,
WO9404938,
WO9423647,
WO9424933,
WO9507055,
WO9611624,
WO9632059,
WO9736192,
WO9749453,
WO9808554,
WO9838908,
WO9915097,
WO9921498,
WO9923956,
WO9926549,
WO9927839,
WO9929253,
WO9933406,
WO9937208,
WO9938449,
WO9952094,
WO9960939,
WO2618211,
Executed on: Oct 05 1990
Assignee: Medtronic Navigation, Inc. (assignment on the face of the patent)
Date Maintenance Schedule
Jan 29 2016: 4 years fee payment window open
Jul 29 2016: 6 months grace period start (w/ surcharge)
Jan 29 2017: patent expiry (for year 4)
Jan 29 2019: 2 years to revive unintentionally abandoned end (for year 4)
Jan 29 2020: 8 years fee payment window open
Jul 29 2020: 6 months grace period start (w/ surcharge)
Jan 29 2021: patent expiry (for year 8)
Jan 29 2023: 2 years to revive unintentionally abandoned end (for year 8)
Jan 29 2024: 12 years fee payment window open
Jul 29 2024: 6 months grace period start (w/ surcharge)
Jan 29 2025: patent expiry (for year 12)
Jan 29 2027: 2 years to revive unintentionally abandoned end (for year 12)