The shape and orientation of rigid or nearly rigid moveable bodies are determined using a shape characterization. Sensors capture a plurality of representations of different perspectives of the body that are analyzed to determine a bounding volume of the body. The shape of the body is determined from the bounding volume. The position of the body is determined using tracking devices that sense the position of the body. The bounding volume and position information are combined to define the shape and orientation in space of the body, and in particular the position of a point of interest on the body.
35. A method of determining the shape of a body using a computer navigation system, the method comprising the steps of:
generating a series of representations of the body;
generating edge information from at least one of the representations;
estimating one or more bounding pyramids from the edge information;
determining a composite bounding volume of the body from the one or more bounding pyramids;
determining the shape of the body from the composite bounding volume; and
determining a position and an orientation of the body using a tracking device associated with the body that communicates with the computer navigation system.
46. A method of determining the shape and orientation of a body using a computer navigation system, the method comprising the steps of:
generating a series of representations of the body from at least two perspectives;
determining a composite bounding volume of the body from the series of representations;
estimating the shape of the body from the composite bounding volume;
comparing the estimated shape of the body to stored shape information using comparison metrics;
refining the estimated shape based on the comparison; and
determining the position and orientation of the body from the shape of the body and the series of representations of the body.
1. A system for calibrating a tracked shape and orientation of a body comprising:
a computer navigation system;
a sensing device adapted to generate a series of representations of a shape of the body;
a tracking device associated with the body in a fixed relation to the body, the tracking device adapted to be detected by the computer navigation system to locate the body relative to the computer navigation system; and
the computer navigation system having a central processing unit that is adapted to process the series of representations and a relative location of the body to the computer navigation system to determine the shape and orientation of the body relative to the tracking device, whereby the shape and orientation of the body is calibrated to the tracking device.
21. A system for calibrating a tracked shape and orientation of a body comprising:
a computer navigation system;
a sensing device adapted to generate a series of representations of a shape of the body based on relative movement between the body and the sensing device;
an emitter associated with the body in a fixed relation to the body, the emitter adapted to be detected by the computer navigation system to locate the body relative to the computer navigation system; and
the computer navigation system having a central processing unit that is adapted to process the series of representations and a relative location of the body to the computer navigation system to determine the shape and orientation of the body relative to the emitter, whereby the shape and orientation of the body is calibrated to the emitter.
1. Field of the Invention
This invention relates to the determination of the shape of rigid or nearly rigid bodies. More particularly, this invention relates to shape determination of such bodies using a computer navigation system.
2. Description of the Background of the Invention
Computer determination of the location of bodies has been used in manufacturing and medical fields for a number of years. Computer navigation requires that the bodies to be tracked by the navigation system have a known shape, so that the orientation and position of the bodies can be properly tracked by the system. Tracking is accomplished by either attaching a tracking device to the body or embedding the tracking device into the body. There are numerous tracking technologies including active and passive optical tracking systems, magnetic systems and inertial systems.
For many applications it is necessary to field calibrate bodies so that the navigation system can thereafter track the body and realistically render the body graphically on a computer display. Typically, this is done by attaching the tracking device in a fixed relation with the body and then inserting the body into a calibration device. These devices can be as simple as a divot in a known relation to the navigation system or can be a device that constrains the body in a predetermined attitude relative to the navigation system with the tip of the body located in a predetermined position. Current tracking calibration requires some physical contact between the body and a calibration device.
For certain situations, it may be desirable to minimize contact with other devices or bodies. For instance, in a surgical setting, sterility protocols require that the body to be used be sterile and that everything it contacts in any way also be sterile. This necessitates sterilizing the calibration device and maintaining it within the sterile field. With space at a premium in a surgical suite, this can be a problem.
In addition, bodies that include attachments, such as screwdrivers, drills, implant insertion devices, etc., need to be recalibrated each time a new attachment is inserted. Lastly, some devices do not have an axial shape with the result that these bodies have been difficult to field calibrate using known methods.
According to one aspect of the invention, a system determines the shape and orientation of a body relative to a tracking device. A sensing device generates a series of representations of the body. A tracking device capable of being detected by a computer navigation system is associated with the body such that the position of the body is located relative to the computer navigation system. The computer navigation system having a central processing unit processes the series of representations of the body and the relative location of the body in order to determine the shape and orientation of the body relative to the tracking device.
In accordance with another aspect of the invention, the shape and orientation of a body relative to an emitter are determined by a system. A sensing device generates a series of representations of the body. An emitter capable of being detected by a computer navigation system is associated with the body such that the position of the body is located relative to the computer navigation system. The computer navigation system having a central processing unit processes the series of representations of the body and the relative location of the body in order to determine the shape and orientation of the body relative to the emitter.
In accordance with a further aspect of the invention, a method to determine the shape and orientation of a body relative to a tracking device using a computer navigation system includes the step of generating a series of representations of the body and thereafter using these representations to determine a composite bounding volume of the body. The shape of the body is determined using the composite bounding volume. A position and an orientation of the body are determined using a tracking device associated with the body that communicates with the computer navigation system.
In yet a further aspect of the invention, a method to determine the shape and orientation of a body using a computer navigation system includes the step of generating a series of representations of the body from at least two perspectives. A composite bounding volume is determined from the series of representations and the shape of the body is determined from the composite bounding volume. The position and orientation of the body are determined from the shape of the body and the series of representations of the body.
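The composite-bounding-volume idea summarized above can be illustrated with a simple voxel-carving sketch: each silhouette view removes the candidate volume that projects outside the body's outline, and the intersection of all views bounds the shape. This is only an illustrative sketch under simplifying assumptions (orthographic projectors, boolean silhouette masks); the mask and projector interfaces are hypothetical, not part of the patent.

```python
import numpy as np

def carve_voxels(silhouettes, projectors, grid):
    """Keep only voxels whose projection falls inside every silhouette.

    silhouettes: list of 2-D boolean masks (True = body pixel).
    projectors:  list of functions mapping (N, 3) points to (N, 2)
                 integer pixel coordinates (x, y) for the matching mask.
    grid:        (N, 3) array of candidate voxel centers.
    """
    keep = np.ones(len(grid), dtype=bool)
    for mask, project in zip(silhouettes, projectors):
        px = project(grid)
        h, w = mask.shape
        # A voxel survives only if it projects inside the image and onto the body.
        inside = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
        keep &= inside
        keep[inside] &= mask[px[inside, 1], px[inside, 0]]
    return grid[keep]
```

With more views from different perspectives, the surviving voxels converge toward the visual hull of the body, which is the composite bounding volume the representations can support.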
The position tracking device 106 has a local coordinate system 120, and each of the cameras 108-1 through 108-M have their own local coordinate systems 122-1 through 122-M. Suitable devices for use as the cameras 108-1 through 108-M include known digital video cameras, digital still cameras, image capture devices and the like.
The position tracking device 106 has a predetermined and fixed relationship to the body 102 and is calibrated to the computer navigation system 104. Furthermore, the position tracking device 106 is capable of tracking the position of a fixed point 124 on the surface of the body 102 with respect to either the coordinate system 120 of the position tracking device 106 or with respect to the coordinate system 112 of the navigation computer 104 because the two coordinate systems are calibrated to one another. The calibration of the two coordinate systems enables any measurements of the point 124 on the body 102 with respect to the coordinate system 120 of the position tracking device 106 to be mapped to the coordinate system 112 of the navigation computer 104 through a linear transformation.
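The mapping between the two calibrated coordinate systems described above is, as the text notes, a linear transformation. A minimal sketch using a 4x4 homogeneous rigid transform follows; the function names and the particular rotation/translation values are illustrative, not taken from the patent.

```python
import numpy as np

def make_rigid_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def map_point(T, p):
    """Map a 3-D point through the homogeneous transform T."""
    ph = np.append(np.asarray(p, dtype=float), 1.0)
    return (T @ ph)[:3]
```

Because the transform is invertible, a measurement of the point 124 expressed in the tracker's coordinate system can be mapped into the navigation computer's coordinate system and back with `np.linalg.inv(T)`.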
The position tracking device 106 can be physically separate from the body 102, or alternatively, the position tracking device 106 can be attached to or otherwise incorporated into the body 102 and still provide the necessary position information. The point 124 can be located in a fixed position relative to the position tracking device 106 or can be determined by a calibration method described hereinafter. The point 124 can be the location of an emitter used by the position tracking device 106, as is discussed hereinafter.
The position tracking device 106 can be one of any of a number of position sensing devices known to those familiar with the art.
Although only one point 124 is depicted as being tracked on the surface of the body 102 to simplify the description, it should be evident that multiple points may be tracked on the same body 102, each with a separate position tracking device. In fact, multiple tracking points may be necessary to determine the full rotational orientation of the body 102. Multiple bodies 102 can also be tracked at the same time by a single system.
Referring once again to
The plurality of cameras 108-1 through 108-M positioned around the body capture images of the body from different perspectives. These cameras 108-1 through 108-M may be either fixed image cameras or video cameras, or some combination of the two technologies. If video cameras are used, then individual frames of the captured video are processed as single images. Preferably, all of the cameras capture frames nearly synchronously in time so that images from multiple viewpoints are correlated. The positions and coordinate systems 122-1 through 122-M of the cameras 108-1 through 108-M are calibrated to one another and to the global coordinate system 112 established by the navigation computer 104. One embodiment of the method of calibration of the cameras is described herein below. In the preferred embodiment, the cameras 108-1 through 108-M are standard video cameras with frame capture hardware in desktop personal computers, or Firewire- and USB-based cameras that are well known in the art.
Fixed backgrounds 110-1 through 110-N are preferably positioned around the body opposite the cameras. These backgrounds 110-1 through 110-N provide a known surround in an image captured by the cameras 108-1 through 108-M that aids in identifying the edges of the body 102 in the image. The backgrounds 110-1 through 110-N may be neutral, black, white, or any color that would increase the contrast between the portion of the image that represents the background 110-1 through 110-N and the portion of the image that represents the body 102. Further, the backgrounds may be backlit to further increase this contrast. It is possible to perform one embodiment of the method of the present invention without fixed backgrounds. However, this is not preferred because of the increased complexity of the shape determination that results from having to subtract the background image from the image of the body 102.
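The high-contrast backgrounds make the silhouette of the body easy to separate from the surround. A minimal sketch of that separation, assuming grayscale images and a simple per-pixel difference threshold (the threshold value and function names are illustrative):

```python
import numpy as np

def body_silhouette(image, background, threshold=30):
    """Boolean mask of pixels that differ from the known background image."""
    diff = np.abs(image.astype(int) - background.astype(int))
    return diff > threshold

def silhouette_edges(mask):
    """Pixels on the boundary of the silhouette (4-neighborhood)."""
    padded = np.pad(mask, 1, constant_values=False)
    # A pixel is interior when all four of its neighbors are also body pixels.
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior
```

The resulting edge pixels are the raw material for the bounding-pyramid estimation: each edge point, together with the camera's center of projection, defines one ray on the surface of the pyramid.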
The navigation computer 104 processes the images captured by the cameras 108-1 through 108-M. The navigation computer 104 may make use of a body database 130 populated with shape information regarding typical bodies that the navigation computer 104 may have to identify. The shape information of a body in the body database 130 preferably comprises coordinates of the vertex points of the body, as are typically available from a computer aided design system. The navigation computer 104 develops one or more comparison metrics by comparing the bounding volume estimated from processing the images from the cameras 108-1 through 108-M to the shape information that is stored in the body database 130. If the shape information for one of the bodies in the body database 130 is found to be highly correlated with the estimated bounding volume, the navigation computer may use the shape information for that body to refine the estimated bounding volume. For example, the navigation computer may develop a comparison metric by analyzing the distances between each vertex of the estimated bounding volume and a corresponding vertex stored as part of the shape information for a body in the body database 130. Another comparison metric that may be developed is the result of comparing the inertia moment axes of the estimated bounding volume with the inertia moment axes of a body in the body database 130. Additional comparison metrics are known to those familiar with the art. A preferred embodiment uses a plurality of comparison metrics to determine the degree of correlation between the estimated bounding volume and a body stored in the body database 130.
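The two comparison metrics named above can be sketched briefly: a mean vertex-to-vertex distance (assuming corresponding vertices are already paired), and the inertia-moment axes obtained as the principal axes of the vertex cloud. This is an illustrative sketch, not the patent's implementation; the function names are hypothetical.

```python
import numpy as np

def vertex_distance_metric(estimated, reference):
    """Mean Euclidean distance between corresponding (N, 3) vertex arrays."""
    return float(np.mean(np.linalg.norm(estimated - reference, axis=1)))

def principal_axes(points):
    """Inertia-moment (principal) axes of a vertex cloud.

    Returns the unit eigenvectors of the covariance matrix as columns,
    ordered from smallest to largest moment.
    """
    centered = points - points.mean(axis=0)
    cov = centered.T @ centered / len(points)
    _, vecs = np.linalg.eigh(cov)
    return vecs
```

A low vertex-distance value and closely aligned principal axes together suggest a high correlation between the estimated bounding volume and a stored body, at which point the stored shape can be used to refine the estimate.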
It is not necessary for the cameras 108-1 through 108-M to image the entire body. Only the portion of the body 102 that is of interest needs to be imaged by the cameras 108-1 through 108-M. Furthermore, the body 102 and the cameras 108-1 through 108-M are preferably positioned with respect to one another so that the field of view of each camera captures approximately the same parts of the body.
One embodiment calibrates the coordinate systems 122-1 through 122-M of the cameras 108-1 through 108-M of the shape characterization system 100, both with respect to each other and with respect to the coordinate system 112 of the computer navigation system, through the use of a calibration body with an exactly known shape.
The shadow sensing devices can be calibrated using a body of known shape and dimension. A representative body 700 that can be used for calibration is depicted in
The algorithms used to estimate the shape of the body 102 can be any of those well known and used in the field of computer graphics. Such algorithms are described in publications in the field such as Computer Graphics: Principles and Practice, by James D. Foley et al. (Addison-Wesley, 1990), which is incorporated herein by reference. From the determined shape of the body 102, the system can then determine the location of the tip 126.
If at least two sensing devices (either cameras 108 or shadow sensing devices 904) are used then the emitter 124 and the position tracking device 106 are not necessary, because the image of the body (or the shadow of the body) for one of the multiple devices provides information about the relative position of the body 102 with respect to the other devices. This information can be used to deduce the position of the body 102 with respect to the coordinate system 112 of the navigation computer 104 by, for example, stereographically determining multiple homologous point pairs in at least two camera views of the body 102. This is because the position of the sensing devices (either 108 or 904) with respect to the coordinate system 112 of the navigation computer 104 is known and tracked during the operation of the shape characterization system and linear transformation can be used to map between the coordinate systems of the sensing devices 108 or 904 and the navigation computer 104.
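The stereographic step mentioned above, recovering a 3-D position from a homologous point pair seen by two calibrated sensing devices, can be sketched with the classic midpoint method: each view defines a ray from its device origin through the observed point, and the reconstructed point is the midpoint of the shortest segment between the two rays. This is a generic sketch (assuming non-parallel rays already expressed in a common coordinate system), not the patent's specific procedure.

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Midpoint of the shortest segment between rays o1 + t*d1 and o2 + t*d2.

    Assumes the rays are not parallel (denominator would be zero).
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b
    # Closest-approach parameters along each ray.
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1 = o1 + t1 * d1
    p2 = o2 + t2 * d2
    return (p1 + p2) / 2
```

With noisy measurements the two rays rarely intersect exactly, which is why the midpoint (rather than an intersection) is returned; averaging over many homologous point pairs then fixes the position of the body 102 in the coordinate system 112.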
In addition, to further enhance the realism of the body 102 as it is displayed on a display monitor, coloration and/or texture can optionally be added by known methods. In this case, one or more light sources 128 optionally can be simulated to shade the rendered view of the body 102 on a computer graphics screen.
Malackowski, Donald W., Schulz, Waldean A., Moctezuma de La Barrera, José Luis