A medical-technical system has a computing unit for calculating a multi-dimensional image dataset from subject signals supplied to the computing unit and obtained from the examination of a subject with at least one pre-operative imaging method. A virtual, multi-dimensional image calculated from the multi-dimensional image dataset by the computing unit can be displayed at a display unit. A controllable window can be displayed at the display unit via a window generator allocated to the computing unit. The image at the display unit corresponding to a virtual location of the controllable window is produced via a control unit allocated to the computing unit.
1. A medical-technical system comprising:
an imaging system operable according to a pre-operative imaging modality to obtain subject signals from an examination subject; a computing unit supplied with said subject signals for calculating a multi-dimensional image dataset from said subject signals; a display unit connected to said computing unit for displaying a virtual, multi-dimensional image produced in said computing unit from said multi-dimensional image dataset; a window generator connected to said computing unit for generating a controllable window displayed at said display unit; and a control unit connected to said computing unit for controlling production of said virtual, multi-dimensional image in said computing unit by including a virtual location of said window in said virtual, multi-dimensional image, said virtual location being controlled by said control unit so that said window displayed at said display unit contains a portion of said virtual, multi-dimensional image surrounded by said window at said virtual location.
2. A medical-technical system as claimed in
3. A medical-technical system as claimed in
4. A medical-technical system as claimed in
5. A medical-technical system as claimed in
6. A medical-technical system as claimed in
7. A medical-technical system as claimed in
8. A medical-technical system as claimed in
9. A medical-technical system as claimed in
10. A medical-technical system as claimed in
11. A medical-technical system as claimed in
12. A medical-technical system as claimed in
13. A medical-technical system as claimed in
14. A medical-technical system as claimed in
15. A medical-technical system as claimed in
16. A medical-technical system as claimed in
17. A medical-technical system as claimed in
18. A medical-technical system as claimed in
19. A medical-technical system as claimed in
20. A medical-technical system as claimed in
21. A medical-technical system as claimed in
22. A medical-technical system as claimed in
23. A medical-technical system as claimed in
24. A medical-technical system as claimed in
25. A medical-technical system as claimed in
26. A medical-technical system as claimed in
27. A medical-technical system as claimed in
28. A medical-technical system as claimed in
29. A medical-technical system as claimed in
30. A medical-technical system as claimed in
31. A medical-technical system as claimed in
32. A medical-technical system as claimed in
33. A medical-technical system as claimed in
34. A medical-technical system as claimed in
35. A medical-technical system as claimed in
36. A medical-technical system as claimed in
37. A medical-technical system as claimed in
38. A medical-technical system as claimed in
39. A medical-technical system as claimed in
40. A medical-technical system as claimed in
41. A medical-technical system as claimed in
42. A medical-technical system as claimed in
43. A medical-technical system as claimed in
44. A medical-technical system as claimed in
45. A medical-technical system as claimed in
46. A medical-technical system as claimed in
47. A medical-technical system as claimed in
48. A medical-technical system as claimed in
49. A medical-technical system as claimed in
50. A medical-technical system as claimed in
51. A medical-technical system as claimed in
52. A medical-technical system as claimed in
53. A medical-technical system as claimed in
54. A medical-technical system as claimed in
55. A medical-technical system as claimed in
1. Field of the Invention
The present invention is directed to a medical-technical apparatus of the type having at least one computing unit for calculating a multi-dimensional image dataset of a subject from subject signals obtained with at least one pre-operative imaging method and supplied to the computing unit.
2. Description of the Prior Art
A virtual multi-dimensional image can be calculated from the multi-dimensional image dataset by the computing unit and can be displayed at a display unit following the computing unit. Such medical-technical systems can be x-ray systems, ultrasound systems, magnetic resonance systems or other imaging systems, and are well known. Window generators for producing a controllable window displayable at the display unit are known particularly in computed tomography and magnetic resonance systems. These windows serve the purpose of characterizing a region of interest by magnification of the voxel values in this window. Further, endoscopic and laparoscopic imaging methods are known that are utilized intracorporeally or intra-operatively in order to obtain real-time images that can be displayed directly and/or via a display unit. Further, examination subjects can be treated and/or examined with endoscopes and/or laparoscopes.
An object of the present invention is to provide a medical-technical system of the type initially described which makes a number of information presentations, particularly images, available to the user for the examination or treatment of an examination subject.
The above object is achieved in accordance with the principles of the present invention in a medical-technical system having an imaging system which is operable according to a pre-operative (e.g. non-invasive) imaging modality to obtain subject signals from an examination subject, a computing unit supplied with the subject signals which calculates a multi-dimensional image dataset therefrom, a display unit connected to the computing unit which displays a virtual, multi-dimensional image produced in the computing unit from the multi-dimensional image dataset, a window generator connected to the computing unit which generates a controllable window displayed at the display unit, and a control unit connected to the computing unit for controlling the production of the virtual, multi-dimensional image in the computing unit by including a virtual location of the window in the virtual, multi-dimensional image, the virtual location of the window being controlled by the control unit.
An advantage of the invention is that the computing unit has a control unit allocated to it for controlling the image output, on the basis of the multi-dimensional image dataset, corresponding to a virtual location of the controllable window at the display unit, this virtual location being controllable via the control unit. In addition to the information derived from the virtual multi-dimensional image, the information derived from the control of the window dependent on its virtual location thus can also be used. According to the invention, consequently, it is possible, for example, to display a two-dimensional image at the display unit and, given the presence of, for example, a three-dimensional image dataset, to adjust the window into the "depth" of the image, so that image information lying outside the plane of the two-dimensional image is also displayed.
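As a rough illustration of this idea (not taken from the patent itself), the following Python sketch assumes the multi-dimensional image dataset is a NumPy volume indexed as depth, row, column, and shows how a display image could be assembled so that the pixels inside a controllable window come from a selectable depth rather than from the displayed slice; the function name, axis convention and parameters are illustrative assumptions.

```python
import numpy as np

def render_with_depth_window(volume, base_slice, window_origin, window_size, window_depth):
    """Return a 2-D display image taken from `base_slice` of a 3-D volume,
    except inside a rectangular window, whose pixels come from `window_depth`.

    volume        -- 3-D array indexed as [depth, row, column] (assumed convention)
    base_slice    -- depth index of the plane shown outside the window
    window_origin -- (row, col) of the window's upper-left corner
    window_size   -- (height, width) of the window
    window_depth  -- depth index whose data is shown inside the window
    """
    image = volume[base_slice].copy()          # background: the ordinary 2-D slice
    r0, c0 = window_origin
    h, w = window_size
    # inside the window, substitute data from the virtually displaced depth
    image[r0:r0 + h, c0:c0 + w] = volume[window_depth, r0:r0 + h, c0:c0 + w]
    return image

# example: a synthetic 64x128x128 dataset, window pushed 10 slices "into" the image
volume = np.random.rand(64, 128, 128)
display = render_with_depth_window(volume, base_slice=32,
                                   window_origin=(40, 40), window_size=(32, 32),
                                   window_depth=42)
```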
The computing unit can calculate a multi-dimensional image dataset on the basis of the examination of a subject with subject signals that are obtained with at least one pre-operative (e.g. non-invasive) imaging method and with one further imaging method, and that are supplied to the computing unit. The image output can be effected, on the basis of the subject signals obtained with the pre-operative or with the further imaging method, at the display unit in the window corresponding to a virtual location of the controllable window that is controllable via the control unit. In addition to the information from the pre-operatively acquired subject signals, it is thus possible, for example, to additionally mix into the window the information that was obtained by the further imaging method. For example, the image information obtained from a magnetic resonance method, an ultrasound method, or an endoscopic or laparoscopic method can be displayed in the window within an x-ray image produced with the pre-operative imaging method.
It is advantageous when at least two overlapping, pre-operative imaging methods are utilized for producing a multi-dimensional image dataset in the examination of a subject, since mutually distinct image information thus can be made available to the examining person.
It is advantageous to produce a multi-dimensional image dataset for each pre-operative imaging method. It is also advantageous to manipulate the multi-dimensional image datasets via the computing unit to form a further multi-dimensional image dataset. In a further embodiment, on the basis of the multi-dimensional image datasets, a virtual multi-dimensional image or a manipulated virtual multi-dimensional image can be displayed at the display unit. The information available to the examining person is increased compared to a single imaging method. The multi-dimensional image datasets and the images are three-dimensional, so that a spatial depth and/or positional identification of, for example, organs or vessels is also possible.
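One conceivable way to manipulate two such multi-dimensional image datasets into a further dataset is a simple voxel-wise blend of already-registered volumes; the short sketch below is only an assumption-laden illustration (the registration to a common grid and the blending weight are presumed, not specified by the text).

```python
import numpy as np

def fuse_datasets(dataset_a, dataset_b, weight=0.5):
    """Voxel-wise weighted blend of two already-registered 3-D datasets of
    identical shape, producing a further multi-dimensional image dataset."""
    if dataset_a.shape != dataset_b.shape:
        raise ValueError("datasets must be registered to the same grid")
    return weight * dataset_a + (1.0 - weight) * dataset_b
```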
In an embodiment wherein a number of windows are controllable via the control unit and wherein an image output corresponding to each virtual location of the respective window ensues on the basis of the multi-dimensional image dataset, then a region of interest or a location of interest in the virtual image can be viewed from different directions.
A magnified (enlarged) display of the information in the controllable window is possible in an embodiment wherein the image output can be effected at a further display area corresponding to the virtual location of the controllable window, controllable via the control unit.
In an embodiment wherein a virtual, three-dimensional image output ensues according to the control of the window via one or more virtual channels, then the information can be made visibly available not only at the end face of the channel in the window but also can be made visibly available to the examining person at the edge region thereof.
In order to make the spatial orientation of the controllable window more readily apparent to the examining person, in a further embodiment a virtual, three-dimensional image of the examined subject is produced via the computing unit on the basis of the multi-dimensional image dataset, and the virtual location of the controllable window is displayable in the virtual three-dimensional image at the display unit. This is also particularly true in a version wherein the virtual channel of the controllable window is displayed in the virtual, three-dimensional image of the examined subject, because the course of the channel is then visible for the examining person.
In an embodiment wherein the computing unit has an instrument generating unit allocated to it for generating a display of at least one virtual instrument at the display unit and wherein the instrument generating unit can be influenced via the control unit for controlling the display of the virtual instrument in view of a virtual location in the virtual three-dimensional image, then, for example, the instrument guidance can be pre-planned in a planned examination or treatment of the subject.
In an embodiment wherein signals corresponding to the control of the virtual instrument can be generated and supplied to a controllable robot arm allocated to the computing unit, and wherein the robot arm is controlled corresponding to the control of the virtual instrument, then a treatment or examination of the subject assisted by the robot arm can be implemented via the inventive medical-technical system.
In an embodiment wherein the computing unit is supplied with subject signals generated with at least one intra-operative imaging method and wherein signals are displayed at the display unit as a real-time image, then the examining person or treating person not only receives the information produced by the (at least one) pre-operative imaging method, but also receives the image information that can be generated directly and immediately at the location and site of the treatment or examination. For example, it is thus possible to make current, supplementary information obtained with the intra-operative imaging method on the basis of, for example, the electrical conversion of physical signals, particularly optical and/or acoustic signals and/or radiation, available to the examining person or treating person at the display unit. This is particularly true when endoscopes or laparoscopes are employed for this purpose.
When endoscopes and/or laparoscopes are used for the intra-operative imaging and/or as examination instruments and/or as treatment instruments, then it is advantageous to provide means for real-time location detection and a location generator for generating a location mark at the display unit corresponding to the real-time location of the endoscope, laparoscope and/or the instrument in the virtual image, since the current position can thus always be displayed in the virtual, preferably three-dimensional, image at the display unit. Additionally, it is advantageous to display the displacement path of the endoscope, the laparoscope and/or the instrument in the virtual three-dimensional image as a channel that can be generated by the location generator.
For preparing a diagnosis, and in particular for planning treatment, it is advantageous to provide means for determining the spatial coordinates and/or the spacings between two locations identified in the virtual and/or real image. For example, the distance between an instrument and a vessel or the distance between the subject being sought and neighboring tissue or a neighboring organ can be determined or the spatial attitude can be defined.
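As a concrete illustration of such a spacing determination (not specified in this form by the text), the following sketch assumes the two identified locations are given as voxel indices and that the voxel spacing of the dataset is known; both are assumptions for illustration.

```python
import numpy as np

def spacing_between(loc_a, loc_b, voxel_spacing_mm):
    """Euclidean distance in millimetres between two locations given as
    (depth, row, column) voxel indices, using the dataset's voxel spacing."""
    delta = (np.asarray(loc_a) - np.asarray(loc_b)) * np.asarray(voxel_spacing_mm)
    return float(np.linalg.norm(delta))

# e.g. distance between an instrument tip and a vessel wall
print(spacing_between((30, 64, 64), (34, 70, 58), voxel_spacing_mm=(2.0, 0.8, 0.8)))
```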
In an embodiment, the control unit has an evaluation unit allocated to it for evaluating at least one voxel in the window dependent on its voxel value, the evaluation unit generating a signal when the voxel value lies within or beyond a prescribable range and/or under and/or over a prescribable voxel value. Then, for example when planning a treatment, a treatment channel (path) can be identified that is oriented such that it does not proceed through a vessel or an organ whose voxel values define, for example, the prescribable range. A number of voxels in the window preferably can be utilized for this purpose, allowing the diameter of the treatment channel also to be defined. It is advantageous when, on the basis of the signal of the evaluation unit, a direction-linked control of the window is enabled or suppressed. The window thus can be guided only into subject areas wherein there is no risk that the channel unintentionally proceeds through a vessel or an organ.
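The gating behaviour described here could look roughly like the following sketch, in which the voxels covered by the window at a proposed new location are checked against a prescribable value range and the displacement is suppressed if any voxel falls inside it; the helper names and the simple range test are assumptions made for illustration, not the patent's own evaluation logic.

```python
import numpy as np

def window_voxels(volume, depth, origin, size):
    """Voxels covered by the window at a given depth and (row, col) origin."""
    r0, c0 = origin
    h, w = size
    return volume[depth, r0:r0 + h, c0:c0 + w]

def move_allowed(volume, proposed_depth, proposed_origin, size, forbidden_range):
    """Return True if no voxel at the proposed window location lies within the
    prescribable range (e.g. values typical of a vessel or organ to be avoided)."""
    low, high = forbidden_range
    voxels = window_voxels(volume, proposed_depth, proposed_origin, size)
    return not np.any((voxels >= low) & (voxels <= high))

# the control unit would only apply a window displacement when move_allowed(...) is True
```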
In an embodiment having means for producing a connecting line between a first location and a second location in the virtual, multi-dimensional image, a connecting line can be drawn, for example, between a planned entry opening for the introduction of an instrument into the body of the subject as the first location and the region to be examined or treated as the second location, this line representing the channel for the introduction of the instrument. It is especially advantageous in conjunction therewith to undertake a calculation for the connecting line, proceeding from the first location to the second location and taking at least one prescribable voxel value into consideration, since this again ensures that there is no risk of the channel unintentionally proceeding through an organ or a vessel. When the second location can be defined by the means for real-time location detection, then, for example proceeding from the location at which the endoscope or the laparoscope is located, a channel to the exterior surface of the examination subject can be calculated that is defined by the connecting line, which can serve as an introduction aid for a further instrument.
Within the framework of the invention, the computing unit can also calculate a third location taking the aforementioned at least one voxel value into consideration, this third location being optimally close to the first location. By displaying the connecting line at the display unit, the third location, which is better suited for avoiding damage and/or injury of organs or vessels, can be defined in the proximity of the first location, proceeding from a desired entry location into the subject that corresponds to the first location.
In an embodiment wherein an examination and/or treatment instrument has an instrument locator allocated to it for generating an instrument location signal, and wherein the instrument location signal is supplied to the computing unit, which generates a signal when the instrument is guided onto, or deviates from, the connecting line, this signal makes it possible to track whether the examination and/or treatment instrument is being guided on the previously calculated, optimum treatment channel to the location in the subject to be treated or to be examined. When a number of examination and/or treatment instruments are employed, then it is advantageous for a number of first and/or second and/or third locations to be prescribed or calculated.
In an embodiment wherein the endoscope or the laparoscope makes use of an ultrasound imaging method, then, in addition to the real-time optical image signals, the imaging signals of the ultrasound method can also supply further information for the examining person or treating person during the examination or treatment.
The subject signals can be obtained from at least one pre-operative imaging method and at least one further imaging method, the subject signals obtained with the further imaging method preferably being output as an image in the window 6 corresponding to a virtual location of the controllable window 6 at the display unit 4. Preferably, an image output can ensue by switching onto the subject signals received with the pre-operative imaging method or with the further imaging method. This is particularly meaningful when the subject 2 is examined with two different pre-operative imaging methods or with a further imaging method distinct therefrom.
A multi-dimensional image dataset can be stored in the memory 10 for each pre-operative imaging method and each further imaging method and/or the subject signals obtained by the imaging method can be manipulated to form a further multi-dimensional image dataset via the computing unit 1. On the basis of the multi-dimensional image datasets, it is thus possible to display a virtual multi-dimensional image or a manipulated, virtual multi-dimensional image at the display unit 4 (FIG. 3).
In order to be able to display the spatial conditions of the organs of the subject 2 optimally well in their arrangement relative to one another, it is advantageous for the multi-dimensional image datasets and at least the one virtual multi-dimensional image to be three-dimensional, or three-dimensionally displayed (FIG. 4).
As can be seen from the Figures, a number of windows 6 are controllable via the control unit 9, with an image output corresponding to every virtual location of the respective window 6 ensuing on the basis of the multi-dimensional image dataset. A region to be examined or to be treated in the subject 2 thus can be viewed from different directions in the virtual three-dimensional image. The computing unit 1 can have a further display unit 11 allocated to it at which, for example, an image output can be generated corresponding to the control of the window 6. Alternatively, a corresponding display area can also be allocated to the display unit 4. The provision of a further display unit 11 has the advantage that the image information of the window can be displayed on a larger area. It is also advantageous to effect the image output corresponding to the control of the window 6 according to a virtual channel 12 in the virtual three-dimensional image, since structures at the "wall" of the channel thus can also be displayed and can be recognized by the examining person. Of course, a number of virtual channels 12 corresponding to the control of respective windows 6 can also be displayed at the display unit 4 or at the further display unit 11.
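To give a flavour of how the channel "wall" could be made visible, the sketch below samples, by nearest-neighbour lookup, the voxels lying on a cylinder around a straight channel axis and unrolls them into a 2-D image; the straight axis, the fixed radius and the sampling scheme are simplifying assumptions, not the rendering actually used in the system.

```python
import numpy as np

def unrolled_channel_wall(volume, start, end, radius, n_axial=64, n_angular=64):
    """Sample voxels on a cylinder of given radius around the straight channel
    from `start` to `end` (voxel coordinates) and return them as a 2-D image
    (axial position x angle): a crude "wall view" of the virtual channel."""
    start, end = np.asarray(start, float), np.asarray(end, float)
    axis = end - start
    axis /= np.linalg.norm(axis)
    # two unit vectors perpendicular to the channel axis
    helper = np.array([1.0, 0.0, 0.0]) if abs(axis[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(axis, helper)
    u /= np.linalg.norm(u)
    v = np.cross(axis, u)
    wall = np.zeros((n_axial, n_angular), dtype=volume.dtype)
    for i, t in enumerate(np.linspace(0.0, 1.0, n_axial)):
        centre = start + t * (end - start)
        for j, phi in enumerate(np.linspace(0.0, 2 * np.pi, n_angular, endpoint=False)):
            p = centre + radius * (np.cos(phi) * u + np.sin(phi) * v)
            idx = np.clip(np.round(p).astype(int), 0, np.array(volume.shape) - 1)
            wall[i, j] = volume[tuple(idx)]   # nearest-neighbour sampling
    return wall
```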
As also shown in
The computing unit 1 also can be supplied with the subject signals that can be generated by an intra-operative imaging method, these signals likewise being displayed at the display unit 4 or at the further display unit 11 and providing the examining person or treating person with additional, real-time image information. An intra-operative imaging method is especially advantageous that is based on the electrical conversion of physical signals, particularly optical and/or acoustic signals, and/or radiation. Endoscopy and/or laparoscopy and/or ultrasound are particularly suited as such intra-operative imaging methods, particularly when the real-time image that can thus be produced and the virtual, multi-dimensional image can be displayed at the display unit 4, or the further display unit 11, via the computing unit 1 in combined form. The real-time image, for example, can be displayed on an additional display unit that is not shown.
To make information about the spatial arrangement in the subject 2 of the endoscope and/or laparoscope and/or the ultrasound applicator and/or an instrument 23 available, the computing unit 1 has a locator 24 allocated to it for real-time location detection of the endoscope and/or laparoscope and/or the instrument 23 on the basis of beam attenuation, or of location-dependent influences on ultrasound, optical, magnetic and/or electrical signals. A location generator 15 generates a location mark OM at the display unit 4 in the virtual image, corresponding to the real-time location of the endoscope and/or laparoscope and/or the ultrasound applicator and/or the instrument 23 (FIG. 6). Not only the location but also the adjustment path VW of the endoscope and/or the laparoscope and/or the ultrasound applicator and/or the instrument 23 can be displayed by the location generator 15 as a channel or line in the virtual three-dimensional image, using a line generator 16 for generating a connecting line VL. This allows the introduction channel of the endoscope and/or the laparoscope and/or the instrument 23 to be closely tracked in the virtual multi-dimensional image (FIG. 6). When the location mark OM is displayed as a location window at the display unit 4 or at the further display unit 11, and when the corresponding real-time image of the endoscope and/or laparoscope is displayed in this location window, then the treating person or examining person obtains not only the spatial arrangement of the endoscope and/or laparoscope from the virtual three-dimensional image but also a real-time image of that location (FIG. 7).
It is also advantageous (
For planning an intervention with an endoscope and/or laparoscope and/or the ultrasound applicator and/or some other instrument 23, it is advantageous for the control unit 9 to have an evaluation unit 17 allocated to it for evaluating at least one voxel V1-VN in the window 6 dependent on whether its voxel value W lies within or outside a prescribable area and/or under and/or over and/or equal to a prescribable voxel threshold Wsw (
It is advantageous for planning the intervention for the computing unit 1 to have a line generator 16 allocated to it for generating a connecting line VL between a first location S1 and a second location S2, or between several first and second locations, in the virtual multi-dimensional image (FIG. 11). The connecting line VL can represent a possible intervention channel. In a preferred development, the connecting line VL is calculated by the computing unit 1, taking at least one prescribable voxel value W into consideration, proceeding from a first location S1 that, for example, defines a planned entry location for the introduction of the endoscope and/or laparoscope and/or ultrasound applicator and/or an instrument 23 into the subject 2, to the second location S2 that defines a region of interest or a location of interest. As already explained, the intervention channel thus can be predetermined such that optimally no vessel or organ is damaged and/or penetrated. The first location S1, for example, can be defined by setting a mark in the virtual multi-dimensional image. The second location S2 can likewise be defined by setting a further mark or by the locator 24. Via the computing unit 1, the connecting line VL can be calculated taking the (at least one) voxel value W into consideration and can be displayed in the virtual multi-dimensional image at the display unit 4. Preferably, however, the computing unit 1 can also calculate a third location S3, taking the at least one voxel value W into consideration and proceeding from the second location S2 that, for example, identifies the region to be examined or the subject to be examined, this third location S3 being optimally close to the first location S1 that identifies the desired intervention location. The connecting line VL that is thus calculated is preferably likewise displayed at the display unit 4. The best intervention point or the best intervention location thus can be predetermined on the basis of organ-related or vessel-related conditions. Within the framework of the invention, a number of first and/or second and/or third locations can be prescribed and/or calculated. On the basis of the point S3 calculated by the computing unit 1, a first location can thus be selected via the joystick 7 and/or the input unit 8 and/or the touch screen 21 and/or the voice input unit 22, and the optimization algorithm can be restarted.
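The line calculation and the search for a nearby third location S3 could be sketched roughly as below; sampling the candidate line at fixed steps and brute-forcing entry candidates on a small grid around S1 are illustrative simplifications, not the patent's optimization algorithm.

```python
import numpy as np

def line_is_clear(volume, p_from, p_to, threshold, n_samples=200):
    """Sample voxels along the straight connecting line and report whether
    every sampled value stays below the prescribable voxel value."""
    p_from, p_to = np.asarray(p_from, float), np.asarray(p_to, float)
    for t in np.linspace(0.0, 1.0, n_samples):
        idx = np.clip(np.round(p_from + t * (p_to - p_from)).astype(int),
                      0, np.array(volume.shape) - 1)
        if volume[tuple(idx)] >= threshold:
            return False
    return True

def third_location(volume, s1, s2, threshold, search_radius=10):
    """Find an entry point S3 close to S1 from which the connecting line to S2
    avoids voxels at or above the threshold (brute-force neighbourhood search)."""
    s1 = np.asarray(s1, int)
    best, best_dist = None, np.inf
    side = 2 * search_radius + 1
    for offset in np.ndindex(side, side, side):
        candidate = s1 + np.asarray(offset) - search_radius
        if np.any(candidate < 0) or np.any(candidate >= np.array(volume.shape)):
            continue
        dist = np.linalg.norm(candidate - s1)
        if dist < best_dist and line_is_clear(volume, candidate, s2, threshold):
            best, best_dist = tuple(int(c) for c in candidate), dist
    return best  # None if no admissible entry point exists in the neighbourhood
```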
When the endoscope and/or laparoscope has an imaging modality allocated to it, then not only are optical real time images obtained therefrom, but also the image signals generated by the imaging modality can be displayed. The employment of an ultrasound modality is especially advantageous here.
Within the framework of the invention, the computing unit 1 can have a data memory 18 allocated to it wherein subject-related data that can be displayed at the display units 4 and/or 11 are stored. So that the treating person or the examining person can also receive information about the physiological condition of the subject 2 to be examined, a physiological signal acquisition system 19 can be provided. In particular, ECG signals, respiration signals, blood pressure signals, temperature signals, etc., as the physiological signals, can be displayed as data or can be optically displayed at the display unit 4 or the further display unit 11 via the computing unit 1.
Although modifications and changes may be suggested by those skilled in the art, it is the intention of the inventors to embody within the patent warranted hereon all changes and modifications as reasonably and properly come within the scope of their contribution to the art.