A CT scanner (A) non-invasively examines a volumetric region of a subject and generates volumetric image data indicative thereof. An object memory (B) stores the data values corresponding to each voxel of the volumetric region. An affine transform algorithm (60) operates on the visible faces (24, 26, 28) of the volumetric region to translate the faces from object space to projections of the faces onto a viewing plane in image space. An operator control console (E) includes operator controls for selecting an angular orientation of a projection image of the volumetric region relative to a viewing plane, i.e. the plane of the video display (20). A cursor positioning trackball (90) inputs i- and j-coordinate locations in image space which are converted (92) into a cursor crosshair display (30) on the projection image (22). A depth dimension k between the viewing plane and the volumetric region in a viewing direction perpendicular to the viewing plane is determined (74). The (i,j,k) image space location of the cursor is operated upon by the reverse of the selected transform to identify a corresponding (x,y,z) cursor coordinate in object space. The cursor coordinate in object space is translated (100, 102, 104) into corresponding addresses of the object memory for transverse, coronal, and sagittal planes (10, 12, 14) through the volumetric region.
15. A method of concurrently displaying a projection image of a volumetric region and at least two intersecting planes through the volumetric region, the method comprising:
displaying the projection image on a portion of a two-dimensional display means;
displaying a cursor at a selected location on the projection image;
transforming the selected cursor location into a corresponding cursor coordinate of the volumetric region;
defining a first plane through the volumetric region which intersects the cursor coordinate;
defining a second plane through the volumetric region which intersects the first plane and the cursor coordinate;
generating a display of data values corresponding to the first plane in a second portion of the two-dimensional display means;
generating a display of the data values corresponding to the second plane in a third region of the display means.
11. In an image display system which includes an object memory for storing data values representing voxels of a three-dimensional volumetric region, a transform means for transforming voxel coordinates of the volumetric region which define polygonal surfaces into transformed polygonal surfaces on a viewing plane, which transformed polygonal surfaces represent projections of the volumetric region polygonal surfaces onto the viewing plane and for reversely transforming locations on the viewing plane into corresponding voxel coordinates in the volumetric region, a two-dimensional display means for generating a two-dimensional human-readable display, the human-readable display including a two-dimensional array of pixels, the transforming means reversely transforming locations of the pixels on the view plane into corresponding pixel coordinates in the volumetric region, and an image processor means for converting the data values corresponding to the reversely transformed pixel coordinates into image values for display at the corresponding pixels of the two-dimensional display means, the improvement comprising:
a cursor positioning means for selecting a location on the two-dimensional display at which a cursor is displayed, the cursor positioning means being operatively connected with the image processor means for causing the cursor to be displayed at the selected location on the two-dimensional display and being operatively connected with the transform means for reversely transforming the selected cursor location to a corresponding cursor coordinate in the volumetric region;
a plane defining means operatively connected with the transform means for defining at least two planes through the volumetric region, which planes intersect at the cursor coordinate, the data values corresponding to the defined planes being supplied to the image processor means to be converted into the image values which are displayed on the display means.
2. An image display system comprising:
an object memory means for storing data values from a three-dimensional image data source representing voxels of a three-dimensional volumetric region;
a transform means for transforming polygonal surfaces of the volumetric region into transformed polygonal surfaces on a viewing plane which transformed polygonal surfaces represent projections of the volumetric region polygonal surfaces on the viewing plane and for reversely transforming locations on the viewing plane into corresponding coordinates in the volumetric region;
a two-dimensional display means for generating a two-dimensional human-readable display corresponding to the viewing plane, the human-readable display including a two-dimensional array of pixels, the transforming means reversely transforming the locations of the pixels into corresponding image pixel coordinates of the volumetric region;
an image processor means for converting the data values corresponding to the reversely transformed image pixel coordinates into image values displayed at the corresponding pixels of the two-dimensional display means;
a cursor positioning means for selecting a location on the two-dimensional display at which a cursor is displayed, the cursor positioning means being operatively connected with the image processor means for causing the cursor to be displayed at the selected location and with the transform means for reversely transforming the selected cursor location to a corresponding cursor coordinate in the volumetric region;
a plane defining means operatively connected with the transform means for defining at least two planes through the volumetric region which intersect at the reversely transformed cursor coordinate, data values corresponding to the defined planes being supplied to the image processor means which converts the data values corresponding to the defined planes into image values which are displayed on the two-dimensional display means, whereby human-readable images of a projection view of the volumetric region and at least two intersecting planes in the volumetric region are displayed concurrently with the human-readable images of the planes changing as the cursor positioning means moves the cursor.
1. A CT scanner system comprising:
a source of radiation for irradiating an examination region from a plurality of directions;
a radiation detection means disposed across the examination region from the radiation source for receiving radiation that has traversed the examination region;
an examined object support means for supporting and moving an object axially through the examination region such that a volumetric region of the object is examined;
a reconstruction means for reconstructing data values representing voxels of the volumetric region;
an object memory means for storing the data values from the reconstruction means;
a transform means for transforming polygonal surfaces of the volumetric region into transformed polygonal surfaces on a viewing plane which transformed polygonal surfaces represent projections of the volumetric region polygonal surfaces on the viewing plane and for reversely transforming locations on the viewing plane into corresponding coordinates in the volumetric region;
a two-dimensional display means for generating a two-dimensional human-readable display corresponding to the viewing plane, the human-readable display including a two-dimensional array of pixels, the transforming means reversely transforming the locations of the pixels into corresponding coordinates of the volumetric region;
an image processor means for converting the data values corresponding to the reversely transformed pixel coordinates into image values displayed at the corresponding pixels of the two-dimensional display means;
a cursor positioning means for selecting a location on the two-dimensional display at which a cursor is displayed, the cursor positioning means being operatively connected with the image processor means for causing the cursor to be displayed at the selected cursor location and with the transform means for reversely transforming the selected cursor location to a corresponding cursor coordinate in the volumetric region;
a plane defining means operatively connected with the transform means for defining at least two of transverse, coronal, and sagittal planes through the volumetric region which intersect at the reversely transformed cursor coordinate, the data values corresponding to the defined planes being supplied to the image processor means which converts the data values corresponding to the defined planes into image values which are displayed on the two-dimensional display means, whereby human-readable images of a projection view of the volumetric region and at least two of intersecting transverse, coronal, and sagittal planes are displayed concurrently, the human-readable images of the planes changing substantially in real time as the cursor positioning means moves the cursor.
3. The system as set forth in
4. The system as set forth in
5. The system as set forth in
6. The system as set forth in
7. The system as set forth in
an axial position indicating means for indicating positions along an axial direction of the ct scanner; and, wherein the cursor positioning means includes means for selecting horizontal and vertical positions along the two-dimensional display, the axial position indicating means and the cursor positioning means being connected with the transform means such that the indicated axial position and the indicated horizontal and vertical positions are reverse transformed into the cursor coordinate supplied to the plane defining means.
8. The system as set forth in
9. The system as set forth in
10. The system as set forth in
12. In the system set forth in
an axial position indicating means for selecting positions along a first axis, which first axis extends away from the viewing plane; and, wherein the cursor positioning means includes a means for selecting horizontal and vertical positions along the two-dimensional display, the axial position indicating means and the cursor positioning means being connected with the transform means such that the indicated axial position and the indicated horizontal and vertical display positions are reversely transformed into the cursor coordinate.
13. In the system as set forth in
a transverse plane defining means for defining a transverse plane through the cursor coordinate;
a coronal plane defining means for defining a coronal plane orthogonal to the transverse plane through the cursor coordinate; and,
a sagittal plane defining means for defining a sagittal plane orthogonal to the transverse and coronal planes through the cursor coordinate.
14. In the system as set forth in
16. The method as set forth in
17. The method as set forth in
18. The method as set forth in
19. The method as set forth in
20. The method as set forth in
The present invention pertains to the image display art. It finds particular application in conjunction with the display of CT medical diagnostic images on video monitors and will be described with particular reference thereto. However, it is to be appreciated that the invention is also applicable to medical diagnostic images from magnetic resonance, nuclear, and other imaging modalities, to quality assurance and other three-dimensional, non-medical images, and the like. The invention is also applicable to hard copy displays, film image displays, and other display formats.
Heretofore, CT scanners have irradiated a planar region of a subject from various angles and detected the intensity of radiation passing therethrough. From the angle and radiation intensity information, two-dimensional image representations of the plane were reconstructed. A typical image representation included a 512×512 pixel array, although coarser and finer arrays are also known.
For three-dimensional imaging, the patient was moved along a longitudinal axis of the CT scanner either continuously for spiral scanning or incrementally, to generate a multiplicity of slices. The image data was reconstructed, extrapolating or interpolating as necessary, to generate CT numbers corresponding to each of a three-dimensional array of voxels. For simplicity of illustration, each of the CT numbers can be conceptualized as being addressable by its coordinate location along three orthogonal axes, e.g. x, y, and z-axes of the examined volume.
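For illustration only, the following sketch shows one way such a voxel array might be held in memory, assuming the reconstructed slices are simply stacked into a three-dimensional array so that a CT number is retrievable by its (x,y,z) address; the array names, shape, and slice count are hypothetical, not values from the patent.

    import numpy as np

    # Hypothetical: stack a few reconstructed 512x512 slices into a volume
    # indexed as volume[z, y, x], one CT number per voxel.
    n_slices = 4
    slices = [np.zeros((512, 512), dtype=np.int16) for _ in range(n_slices)]
    volume = np.stack(slices, axis=0)          # shape (n_slices, 512, 512)

    x, y, z = 100, 200, 2                      # coordinate location of a voxel
    ct_number = volume[z, y, x]                # CT number addressed by (x,y,z)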
Typically, the volume data was displayed on the planar surface of a video monitor. Various planar representations of the volume data are now commonly available. Most commonly, the examined volume was a six-sided prism with square or rectangular faces. The operator could select a display depicting any one of the six faces of the prism or any one of the slices through an interior of the prism along one of the (x,y), (x,z) or (y,z) planes. Some display formats also permitted oblique planes to be selected. Display formats were also available which permitted two or three sides of the prism to be displayed concurrently on a two-dimensional (i,j) image plane with appropriate visual cues to give the impression of a perspective view in three dimensions. That is, the visible faces were foreshortened (or extended) and transformed from rectangles to parallelograms by a sine or cosine value of an angle by which the viewing direction was changed. In this manner, each face of the prism was transformed into its projection along the viewing direction onto the viewing plane. This gave the faces the appearance of extending either parallel to the viewing plane or video monitor screen, or extending away from the screen at an oblique angle. Some routines added shading to the view to give further visual cues of depth.
More specifically, the operator could typically cause a selected surface, such as a transverse (x,y) plane on the face (z=0) of the examined volume, to be displayed. The operator could then cause a selected number of transverse planar slices to be peeled away or deleted by indexing along the z-axis (z=1,2,3, . . . ,n) to view the nth interior transverse plane. The operator could then position the cursor on the (x,y) or transverse plane to select a coronal or (x,z) plane. The selected coronal plane would then be displayed. The operator would then position the cursor on the displayed coronal plane to select a sagittal or (y,z) plane. Prior art medical image workstations commonly permitted the transverse, coronal, and sagittal planes or views to be displayed concurrently on the same screen. Some also permitted the three-dimensional projection image to be displayed concurrently as well.
One of the disadvantages of these prior art systems is that they did not permit simultaneous, interactive adjustment of the selected transverse, coronal, and sagittal planes. These prior art adjustments were commonly based on a two-dimensional reference plane which was always co-planar with the transverse, sagittal, or coronal planes, thereby restricting the sectioning cursor to two-dimensional movements. In the display format in which all three planes were displayed concurrently, the operator moved the cursor to one of the views, which then became the "active" view. By moving the cursor on the active view, the next planar slice could be reselected. By then moving the cursor to the readjusted planar slice, the next slice could be readjusted. Thus, readjusting the displayed transverse, coronal, and sagittal views was sequential and, therefore, relatively slow and time consuming.
The present invention contemplates a new and improved method and apparatus for displaying images which permits concurrent, real-time readjustment of the transverse, coronal, and sagittal view displays by using a rotatable 3D object (or volume) and its projection view as a three-dimensional reference surface which allows the sectioning cursor to move in three dimensions.
In accordance with one aspect of the present invention, a volume object memory means is provided for holding data values indicative of each voxel of a volumetric region of the object. An affine transform means rotates, scales, and translates points, lines, and surfaces of the volumetric region (object space) into transformed points, lines, and surfaces of a 3D projection view when displayed on the pixels of a two-dimensional image plane or video display (image space). The transform means also supplies a reverse of the selected transform to transform the display pixels into corresponding locations of the object volumetric region. A video processor generates a video display of the data values that correspond to the reverse transformed locations in the volumetric region. An operator uses a cursor control means to move a cursor on the video display. The transform means also reversely transforms coordinates of the cursor from the image plane to a corresponding location in the volumetric region. A plane defining means defines orthogonal planes, preferably, transverse, coronal, and sagittal planes, which intersect the reversely transformed location in the volumetric region. The video processor means receives data values from the object memory lying along each of the planes and converts them into a corresponding video image. Preferably, the video processor converts the two-dimensional projection image representation and the planar images into images which are displayed concurrently in a common video display.
In accordance with another aspect of the present invention, a third image space coordinate is determined in accordance with a relative distance along the viewing direction from the screen pixel at the cursor to a point of intersection with a displayed voxel of the object.
One advantage of the present invention is that the relationship between the volume projection view and the transverse, coronal, and sagittal section (re-sliced) planes is maintained when the volume view is rotated for better visualization. These planes intersect at the cursor in both object and image space. The reverse transform between these spaces enables the planes to be updated correctly in object space regardless of the rotation (or viewing direction or orientation) of the volume projection view.
Another advantage of the present invention is that it permits interactive and simultaneous adjustment of the transverse, coronal, and sagittal planes.
Another advantage of the present invention is that it assists the operator in relating the position of the displayed transverse, sagittal, and coronal planes with their locations through a perspective type view of the volume.
Another advantage of the present invention is that it permits the operator to select the intersection point of the transverse, coronal, and sagittal planes in three dimensions in object space by using a cursor on a two-dimensional screen.
Still further advantages of the present invention will become apparent to those of ordinary skill in the art upon reading and understanding the following detailed description of the preferred embodiments.
The invention may take form in various components and arrangements of components, and in various steps and arrangements of steps. The drawings are only for purposes of illustrating a preferred embodiment and are not to be construed as limiting the invention.
FIG. 1 is a diagrammatic illustration of an image data display system in accordance with the present invention;
FIG. 2 is a diagrammatic illustration of a preferred video display generated by the present invention;
FIG. 2A illustrates a transverse plane through the volumetric region;
FIG. 2B illustrates a coronal plane through the volumetric region;
FIG. 2C illustrates a sagittal plane through the volumetric region;
FIG. 3 is a diagrammatic explanation of the transverse, sagittal, and coronal planes relative to a human subject;
FIG. 4 is analogous to FIG. 2 but illustrates a projection view that has at least one obliquely cut surface;
FIG. 5 is analogous to FIG. 4 but with the perspective view rotated to another viewing orientation.
With reference to FIG. 1, a diagnostic imaging device A non-invasively examines a polyhedral volumetric region of a subject and generates a data value indicative of each voxel within the volumetric region. The data values corresponding to voxels of the polyhedron are stored in a three-dimensional object memory means B. The shape and size of the volumetric region are generally defined by the diagnostic imaging device. In the embodiment illustrated in FIG. 2, the region is illustrated as a rectangular prism, i.e. a six-sided volume having rectangular or square orthogonal faces. With continuing reference to FIG. 2 and further reference to FIG. 3, the volumetric region is defined by x, y, and z-coordinates which are defined in terms of a transverse plane 10, coronal plane 12, and sagittal plane 14 of a patient or other examined object. For each voxel within the polyhedral examined volumetric region, the imaging device A generates a data value, e.g. a CT number, which, for simplicity of illustration, is retrievable from the object memory B by addressing the object memory with the (x,y,z) coordinates of the voxel. A data processing system C processes the three-dimensional object data to generate a video display D in accordance with instructions input by the operator on an operator control console or system E.
With reference to FIG. 2, the video display D includes a video display screen 20 having a plurality of, e.g. four, view ports. Each view port displays an independently changeable video image. In the preferred embodiment, a first view port 22 displays a projection image depicting a projection of the imaged volume onto the video screen or viewing plane 20. The video screen or viewing plane includes a two-dimensional array of pixels defined by coordinates (i,j). A third coordinate k is defined in a direction orthogonal to the i, j-coordinates of the viewing plane. Faces 24, 26, 28 of the 3D projection image are "distorted" to give visual cues indicative of the depth or distance along the k-axis between the viewing screen and each point on the surface. The rectangular faces in the illustrated projection image are displayed as parallelograms with the angles at the corners changed from orthogonal in proportion to the relative angular orientation or rotation of the viewing plane relative to the examined object region. The dimensions of the parallelograms are likewise foreshortened in accordance with the angular orientation or rotation. Note that if a face is parallel to the viewing plane, i.e. orthogonal to the viewing direction, it is displayed full size with 90° corners. However, as the faces become obliquely oriented relative to the viewing screen, the faces are foreshortened and the change in the angles at the corners of the parallelograms becomes more pronounced.
With reference to FIGS. 4 and 5, the operator may conveniently position or rotate the 3D projection image with a selected apparent orientation when viewed from the viewing plane; conversely, the viewer may re-orient or rotate the viewing plane around the polyhedral imaged volume. The volume may be rotated about a selected axis to bring previously hidden faces into view.
With continuing reference to FIG. 2 and further reference to FIG. 3, the operator positions a cursor 30 at a selectable location on the first view port or portion 22 of the video display D. A second view port 32 displays the data along the transverse plane 10 through the position of the cursor. In the coordinate system of FIG. 2, the transverse plane is also the (x,y) plane. In a CT scanner in which a human patient is disposed in a prone position, the transverse plane is also known as an axial plane. Because the x, y, and z-coordinates in object space are fixed, the displayed (x,y) plane is selected by adjusting the selected distance along the z-axis. A third view port 34 displays an image of the coronal plane 12, i.e. the (x,z) plane. A fourth view port 36 displays the (y,z) or sagittal plane 14 through the imaged volume which intersects the (x,y,z) position of the cursor 30.
To index through the available coronal planes, the operator moves the cursor 30 across face 24 along track 38c. By moving the cursor along track 38s, the sagittal plane is re-positioned left and right in the illustration of FIG. 2C. To index the transverse planes along with the coronal and sagittal planes, the operator either uses a transverse slice selection means other than the cursor or moves the cursor along one of paths 38t and 38t'. The examined volumetric region illustrated in FIG. 2 is through the pelvic region of the patient. The pelvic bone 40 and lumbar vertebrae 42 are visible on the surface of the projection image of the first view port 22. The operator's view of the pelvic bone, lumbar vertebrae, and other associated tissue is adjusted by moving the cursor 30 until the transverse, coronal, and sagittal images are optimized for the selected diagnostic procedure.
Of course, the examined volume may not coincide precisely with the region that the operator wants to examine. Other tissues and structures, such as air and the patient couch, are commonly examined and imaged along with the patient. An editing means 44 enables the operator to effectively remove unwanted voxels from the examination region. Although removing a single selected voxel is conceptually simplest, the operator more typically removes or edits larger groups of voxels. As is conventional in the art, the operator may define cutting planes, either parallel to one of the transverse, coronal, or sagittal planes, or oblique cutting planes. The operator may also define curved cutting surfaces. A volumetric region edited into a polyhedron with at least one oblique surface is illustrated in FIGS. 4 and 5. Rather than editing voxels based on spatial location, the operator can also edit voxels based on other criteria. For example, air, soft tissue, bone, and other types of imaged subject matter have CT numbers in distinct ranges. The operator can delete all voxels with CT numbers corresponding to air, for example. As another example, the operator may choose to delete all voxels except those with CT numbers corresponding to bone. This provides a skeletal display in the projection image. As yet another option, the operator may perform a separate editing for the projection image and the three orthogonal slice images. For example, the projection image may be a tissue specific depth image with shading and the three orthogonal images can be interpolated CT number images. As another example, the projection image can be edited for tissue type to "peel away" selected tissue types, thereby providing a new surface for the cursor to traverse. This can be achieved by duplicating the object memory, accessing the memory holding data edited with one editing function for the projection image, and accessing the memory edited with the other editing function to display the orthogonal slices. In this manner, the operator can display, for example, a projection view of a section of the patient's skeleton to facilitate accurate placement of the cursor while viewing images of all tissue in the orthogonal slices through the cursor position.
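The following is a minimal sketch of CT-number-based editing as described above, assuming a NumPy volume of CT numbers; the threshold values and array contents are illustrative only, not values taken from the patent.

    import numpy as np

    # Hypothetical CT-number ranges for air and bone.
    AIR_MAX = -300
    BONE_MIN = 200

    volume = np.random.randint(-1000, 1500, size=(64, 128, 128)).astype(np.int16)

    # Editing function 1: delete all voxels with CT numbers corresponding to air.
    no_air = volume.copy()
    no_air[no_air <= AIR_MAX] = 0

    # Editing function 2: delete all voxels except those corresponding to bone,
    # giving a skeletal display for the projection image.
    skeletal = volume.copy()
    skeletal[skeletal < BONE_MIN] = 0

    # Duplicated object memories: one edited copy can back the projection image
    # while the other backs the transverse, coronal, and sagittal slices.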
With reference to FIGS. 4 and 5, in many instances, the displayed projection image of the volume has one or more oblique surfaces 46. As the cursor 30 moves along an oblique surface, such as along track 48, all three of the transverse, coronal, and sagittal planes are indexed concurrently. Even after the transverse, coronal, and sagittal views are selected, the operator can rotate the viewing plane or imaged object, such as between the positions of FIGS. 4 and 5, without affecting the orientation or other aspects of the display of the transverse, coronal, and sagittal planes. The rotation can expose surfaces that were not previously visible.
With reference again to FIG. 1, the non-invasive examination means A, in the illustrated embodiment, is a CT scanner. However, other sources of three-dimensional image data, both outside the medical imaging field and in the medical imaging field, such as magnetic resonance imagers, are contemplated. The non-invasive medical diagnostic apparatus A includes an examination region 50 for receiving the subject supported on a patient couch or support 52. An irradiating means 54, such as an x-ray tube, magnets, or radio frequency coils, irradiates the patient. A radiant energy receiving means 56, such as radiation detectors, radio frequency receiving coils, or the like, receives diagnostically encoded radiant energy. In the illustrated CT scanner example, the source of radiant energy is an x-ray tube which generates a fan-shaped beam of x-rays. The fan-shaped beam of x-rays passes through the subject in the examination region 50, impinging upon a ring of x-ray detectors of the radiant energy detection means 56. The x-ray tube is mounted for rotation by a motor or other rotating means about the examination region such that the patient is irradiated from a multiplicity of directions. The radiation detectors are positioned either in a stationary ring surrounding the examination region or in an arc which rotates with the x-ray tube to receive the radiation that has traversed the patient.
An image reconstruction means 58 reconstructs an image representation from the received radiation. For example, the image reconstruction means may reconstruct a 512×512 array of data values, each data value being representative of a radiation transmissive property of a corresponding voxel of one plane or slice of the volumetric region. The patient couch is indexed axially through the examination region between scans to generate a plurality of slices of image data. Optionally, the patient couch may be translated continuously such that the x-ray beam passes through the patient along a spiral path. If spiral data is generated, a conventional spiral data reconstruction means is utilized to convert the spiral data into data values corresponding to each of a three-dimensional orthogonal array of voxels, e.g. an x, y, z array where x, y, and z are the coordinate axes of object space. Object space is the (x,y,z) coordinate system of the patient in the scanner; whereas, image space is the (i,j,k) coordinate system of the projection image presented in the first view port 22.
The data processing system C includes transform means 60 which translates, rotates, and scales coordinates, lines, curves, and surfaces from object space to image space and reversely transforms locations, lines, curves, and surfaces from image space to object space. More specifically, the affine transform is a matrix M which translates coordinates or vectors x, y, z in object space to corresponding coordinates or vectors i, j, k in image space, i.e., in homogeneous coordinates:

    (i, j, k, 1) = M (x, y, z, 1),

where M combines the selected rotation, scaling, and translation. Conversely, the reverse of the affine transform matrix converts coordinates or vectors in image space to corresponding coordinates or vectors in object space, i.e.:

    (x, y, z, 1) = M^-1 (i, j, k, 1).

The k-coordinate of the projection image is uniquely defined by the i, j-coordinate. For example, the planes of the polyhedral volumetric region are mathematically defined in the process of editing the data or otherwise preparing the data for display. Accordingly, the k value can be retrieved from a look-up table or otherwise uniquely calculated from this a priori information. When the viewing angle is changed, the values of the transform matrix are modified in accordance with trigonometric functions of the angle of rotation.
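As a rough illustration of the forward and reverse transforms, the sketch below builds a 4x4 homogeneous matrix from rotation angles, a scale factor, and a translation, applies it to an object-space point, and inverts it to recover the point; the angle, scale, and translation values are arbitrary assumptions, and the code is a sketch rather than the patented implementation.

    import numpy as np

    def affine_matrix(rx, ry, rz, scale=1.0, translation=(0.0, 0.0, 0.0)):
        # Rotation about the x, y, and z axes (radians), uniform scaling,
        # and translation, combined into one 4x4 homogeneous matrix M that
        # maps object-space (x, y, z, 1) to image-space (i, j, k, 1).
        cx, sx = np.cos(rx), np.sin(rx)
        cy, sy = np.cos(ry), np.sin(ry)
        cz, sz = np.cos(rz), np.sin(rz)
        Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
        Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
        Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
        M = np.eye(4)
        M[:3, :3] = scale * (Rz @ Ry @ Rx)
        M[:3, 3] = translation
        return M

    M = affine_matrix(np.radians(30), np.radians(15), 0.0,
                      scale=1.2, translation=(10.0, 5.0, 0.0))
    xyz1 = np.array([40.0, 60.0, 20.0, 1.0])   # object-space point (x, y, z, 1)
    ijk1 = M @ xyz1                            # forward transform to (i, j, k, 1)
    back = np.linalg.inv(M) @ ijk1             # reverse transform recovers (x, y, z, 1)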
The operator control means E includes a mouse, trackball, or other angular orientation input means 62θx, 62θy, and 62θz for inputting a degree of rotation of the viewing angle about the x, y, and z-axes to rotate the 3D projection image as illustrated by way of example in FIGS. 2-4. Viewing angle buffers 64θx, 64θy, and 64θz store the selected viewing angle. A one-dimensional joystick or other scale input means 66 controls enlargement and reduction of the viewed 3D volume image. A scale or magnification buffer 68 stores the selected scale factor. Optionally, other controls may be provided for translating the viewed 3D volume projection image.
The affine transform means 60 adds the indicated x, y, and z-translation factors, multiplies the length and angle of the polyhedral faces from the volume space by sine and cosine values of the indicated rotation angles, and multiplies the dimensions by the scale factor. An image space memory means 70 stores the transformed face polygons and a grid indicative of the (i,j) pixel locations on the video display D. A data retrieval means 72 identifies each pixel location which falls within one of the polygonal faces and determines its location relative to that polygon.
A depth from the viewing plane determining means 74 determines a depth or distance k in the viewing direction from the viewing plane to a point of intersection with a viewed voxel of the imaged volume. More specifically, the depth determining means 74 determines the distance from the cursor pixel of the viewing plane to a point of intersection with the underlying face. The depth may be determined, for example, from a look-up table addressed by (i,j), by the conventional ray tracing technique in which a length of a ray projected in the viewing direction from the corresponding pixel of the viewing plane to a point of intersection with the object is determined, or the like. It is to be appreciated that the point of intersection need not be on the surface 26. As indicated above, some voxels of the object may be given a value which renders them invisible to the viewer. For example, only data values within object space which have a CT number corresponding to a selected tissue type, such as bone, may be displayed in a surface rendered image based on a depth image. In this 3D application, 3D tissue surfaces are created by allowing the operator to select a tissue type, such as bone, and segmenting out the other tissue types. In this 3D technique, the depth from the screen to the surface is commonly looked up from a visible surface memory buffer, or is determined with a suitable ray tracing technique. Data values corresponding to air and other tissue types are set to zero, or to another value which indicates that they are not displayed. As the cursor 30 moves across the patient's face, such as across the patient's nose, the depth from the viewing plane to the viewable voxels changes, causing a corresponding change in the location of the transverse plane. If the cursor moves obliquely across face 24 as well, all three displayed planes change concurrently.
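A simplified sketch of the depth look-up idea follows: a visible-surface depth buffer giving, for each (i,j) pixel, the distance k to the first displayed voxel along the viewing direction. For brevity it assumes the viewing direction is aligned with one array axis; the real system casts rays through the transformed volume or reads a precomputed visible surface buffer, and the tissue threshold used here is hypothetical.

    import numpy as np

    def depth_map(visible):
        # For each pixel column, distance k to the first voxel flagged as
        # visible along axis 0 (the assumed viewing direction); -1 if the
        # ray never intersects a displayed voxel.
        hit = visible.argmax(axis=0).astype(np.int32)
        miss = ~visible.any(axis=0)
        hit[miss] = -1
        return hit

    volume = np.random.randint(-1000, 1500, size=(64, 128, 128))
    visible = volume >= 200                 # e.g. display only bone-range voxels
    k_lookup = depth_map(visible)           # depth k addressed by the pixel grid

    i0, j0 = 50, 40                         # cursor pixel
    k_at_cursor = k_lookup[j0, i0]          # depth used as the cursor k-coordinate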
The data retrieval means 72 accesses the transform means 60 and the depth means 74 to cause the image space pixel locations to be transformed with the reverse of the transform indicated by the buffers 64, 68. The reverse transform of the (i,j,k) pixel location provides a corresponding (x,y,z) coordinate in object space. A memory access means 76 uses the object space coordinates to retrieve the corresponding data values from the object memory B.
Although reversely transformed coordinates of the 3D projection image can fall directly on voxels of object space, the coordinates in many instances will fall in between. Accordingly, an interpolating means 80 interpolates the data values of the two, four, or eight voxels closest to the reversely transformed coordinates, weighting each in inverse proportion to its relative distance.
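A minimal trilinear interpolation sketch follows, weighting the eight voxels surrounding a non-integer coordinate by their proximity along each axis; the indexing convention (volume[z, y, x]) and the sample values are assumptions made for illustration.

    import numpy as np

    def trilinear(volume, x, y, z):
        # Weight the eight surrounding voxels in inverse proportion to their
        # distance from the fractional (x, y, z) coordinate.
        x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
        fx, fy, fz = x - x0, y - y0, z - z0
        value = 0.0
        for zi, wz in ((z0, 1 - fz), (z0 + 1, fz)):
            for yi, wy in ((y0, 1 - fy), (y0 + 1, fy)):
                for xi, wx in ((x0, 1 - fx), (x0 + 1, fx)):
                    value += wz * wy * wx * volume[zi, yi, xi]
        return value

    volume = np.arange(4 * 4 * 4, dtype=float).reshape(4, 4, 4)
    interpolated = trilinear(volume, 1.3, 2.7, 0.5)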
The retrieved, interpolated values from the object memory B are converted by a video processor 82 into a video display on a video display means D. If the video processor can generate images with more than one pixel format, it is connected with the image space memory 70 for supplying an indication of the selected pixel grid. Optionally, a video memory 84 may be provided. The video memory has a first portion 86 corresponding to the first video port into which the data for the projection image is loaded. The video processor 82 then converts the data from the video memory into a video signal to drive the video display.
The operator control panel E further includes a cursor positioning means 90, such as a mouse or trackball for indicating the (i,j) location of the cursor relative to the projection image. A cursor image control means 92 is connected between the cursor positioning means and the video processor 82 to cause the cursor 30, such as a crosshair, to be displayed at the selected (i,j) coordinates indicated by the cursor positioning means. Optionally, the transverse (or other) slice may be selected by a z- (or other) axis control 94 including associated circuitry or buffers 96. As the z-control is indexed, slices on the front face 24 of the displayed three-dimensional object are "peeled away". That is, the displayed front face is removed and the next plane down becomes the frontmost face of the volumetric region. This process is repeated until a selected transverse plane is reached. The cursor control 90 increments the i and j-coordinates of the cursor crosshair, causing the crosshair to be shifted vertically and horizontally across the video display. The k-coordinate is selected either from the depth measuring means 74 or the z-control 94.
The (i,j,k) coordinate corresponding to the cursor is conveyed to the transform means 60, which performs a reverse of the selected transform on the cursor location to transform it from image space to the corresponding (x,y,z) coordinate of object space.
A transverse or (x,y) plane defining means 100 converts the designated (x,y,z) coordinate into an identification of the transverse or (x,y) plane. As illustrated in FIG. 2A, the transverse or (x,y) plane has a fixed orientation in object space. Only the z-location in object space is necessary to identify the (x,y) plane. A coronal or (x,z) plane defining means 102 defines the addresses of the selected coronal plane in the object memory B. Again, as illustrated in FIG. 2B, because the orientation of the coronal plane in object space is fixed, the position of the plane along the y-axis determines the coronal plane. Analogously, a sagittal or (y,z) plane defining means 104 converts the received cursor coordinates into the appropriate addresses in the object memory for the sagittal plane. The CT or other data values at the addresses for the transverse, coronal, and sagittal planes are fed to corresponding subregions 110, 112, 114, respectively, of the video memory 84. In this manner, by moving the (i,j) cursor position selector 90 or the z-input control 94, one or more of the planes is redefined and the new data values are loaded immediately into the video memory and supplied to the video processor.
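The plane definition step can be pictured with the short sketch below: given the reverse-transformed cursor coordinate, the three orthogonal planes through it are simply the slices of the object memory fixed at that coordinate along each axis. The array layout and coordinate values are illustrative assumptions.

    import numpy as np

    def orthogonal_planes(volume, x, y, z):
        # volume is indexed [z, y, x]; each returned plane intersects the
        # cursor coordinate (x, y, z) in object space.
        transverse = volume[z, :, :]      # (x, y) plane, identified by z alone
        coronal = volume[:, y, :]         # (x, z) plane, identified by y alone
        sagittal = volume[:, :, x]        # (y, z) plane, identified by x alone
        return transverse, coronal, sagittal

    volume = np.random.randint(-1000, 1500, size=(64, 128, 128))
    t_plane, c_plane, s_plane = orthogonal_planes(volume, x=30, y=64, z=10)
    # Moving the cursor only changes (x, y, z); the three displayed planes are
    # re-read from the object memory and updated immediately.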
In the preferred embodiment, the reversely transformed cursor coordinates are also supplied to a second cursor image control means 116 which causes the video processor to generate a crosshair or other cursor indicator 118 on each of the transverse, coronal, and sagittal images at a display location corresponding to the cursor coordinate, hence corresponding to the cursor in the projection image in the first video port 22.
The invention has been described with reference to the preferred embodiment. Obviously, modifications and alterations will occur to others upon reading and understanding the preceding detailed description. It is intended that the invention be construed as including all such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
Mattson, Rodney A., Yanof, Jeffrey H., Patel, Paula I.
Patent | Priority | Assignee | Title |
10054932, | Mar 11 2013 | AUTODESK, Inc | Techniques for two-way slicing of a 3D model for manufacturing |
10070801, | Jul 10 2008 | Covidien LP | Integrated multi-functional endoscopic tool |
10096126, | Jun 03 2008 | Covidien LP | Feature-based registration method |
10154798, | Apr 08 2009 | Covidien LP | Locatable catheter |
10210667, | Feb 08 2013 | EWOOSOFT CO , LTD | Displaying 3D image with a plurality of surface images at depths of interest |
10255672, | Feb 24 2012 | Toshiba Medical Systems Corporation | Medical apparatus and X-ray CT system |
10285623, | Jun 06 2008 | Covidien LP | Hybrid registration method |
10321803, | May 01 2005 | Covidien LP | System and method for image-based alignment of an endoscope |
10383509, | Sep 15 2003 | Covidien LP | System of accessories for use with bronchoscopes |
10390686, | Sep 27 2007 | Covidien LP | Bronchoscope adapter and method |
10398395, | Apr 14 2015 | Canon Medical Systems Corporation | Medical image diagnostic apparatus |
10418705, | Oct 28 2016 | Covidien LP | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
10426555, | Jun 03 2015 | Covidien LP | Medical instrument with sensor for use in a system and method for electromagnetic navigation |
10430981, | Oct 27 2010 | Koninklijke Philips Electronics N V | Image artifact identification and mitigation |
10446931, | Oct 28 2016 | Covidien LP | Electromagnetic navigation antenna assembly and electromagnetic navigation system including the same |
10466854, | Jun 10 2016 | HEXAGON TECHNOLOGY CENTER GMBH | Systems and methods for accessing visually obscured elements of a three-dimensional model |
10478092, | Jun 06 2008 | Covidien LP | Hybrid registration method |
10478254, | May 16 2016 | Covidien LP | System and method to access lung tissue |
10514451, | Jul 15 2014 | Garmin Switzerland GmbH | Marine sonar display device with three-dimensional views |
10517505, | Oct 28 2016 | Covidien LP | Systems, methods, and computer-readable media for optimizing an electromagnetic navigation system |
10580325, | Mar 24 2010 | SIMBIONIX LTD | System and method for performing a computerized simulation of a medical procedure |
10582834, | Jun 15 2010 | RIOJA LLC; Covidien LP | Locatable expandable working channel and method |
10597178, | Jan 18 2006 | Medtronic Navigation, Inc. | Method and apparatus for providing a container to a sterile environment |
10615500, | Oct 28 2016 | Covidien LP | System and method for designing electromagnetic navigation antenna assemblies |
10638952, | Oct 28 2016 | Covidien LP | Methods, systems, and computer-readable media for calibrating an electromagnetic navigation system |
10667887, | Jun 29 2007 | MEDIT CORP | Video-assisted margin marking for dental models |
10674936, | Jun 06 2008 | Covidien LP | Hybrid registration method |
10722311, | Oct 28 2016 | Covidien LP | System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map |
10743748, | Apr 17 2002 | Covidien LP | Endoscope structures and techniques for navigating to a target in branched structure |
10751126, | Oct 28 2016 | Covidien LP | System and method for generating a map for electromagnetic navigation |
10758212, | Dec 03 2011 | Koninklijke Philips N.V. | Automatic depth scrolling and orientation adjustment for semi-automated path planning |
10792106, | Oct 28 2016 | Covidien LP | System for calibrating an electromagnetic navigation system |
10795457, | Dec 28 2006 | D3D TECHNOLOGIES, INC | Interactive 3D cursor |
10898153, | Mar 01 2000 | Medtronic Navigation, Inc. | Multiple cannula image guided tool for image guided procedures |
10912487, | Jul 10 2008 | Covidien LP | Integrated multi-function endoscopic tool |
10936090, | Dec 28 2006 | D3D Technologies, Inc. | Interactive 3D cursor for use in medical imaging |
10942586, | Dec 28 2006 | D3D Technologies, Inc. | Interactive 3D cursor for use in medical imaging |
10952593, | Jun 10 2014 | Covidien LP | Bronchoscope adapter |
10980400, | Sep 27 2007 | Covidien LP | Bronchoscope adapter and method |
10980494, | Oct 20 2014 | The University of North Carolina at Chapel Hill | Systems and related methods for stationary digital chest tomosynthesis (s-DCT) imaging |
11006914, | Oct 28 2015 | Medtronic Navigation, Inc. | Apparatus and method for maintaining image quality while minimizing x-ray dosage of a patient |
11016579, | Dec 28 2006 | D3D Technologies, Inc. | Method and apparatus for 3D viewing of images on a head display unit |
11024080, | Mar 11 2013 | AUTODESK, Inc | Techniques for slicing a 3D model for manufacturing |
11036311, | Dec 28 2006 | D3D Technologies, Inc. | Method and apparatus for 3D viewing of images on a head display unit |
11074702, | Jun 03 2008 | Covidien LP | Feature-based registration method |
11127106, | Jun 28 2019 | Intel Corporation | Runtime flip stability characterization |
11160617, | May 16 2016 | Covidien LP | System and method to access lung tissue |
11219489, | Oct 31 2017 | Covidien LP | Devices and systems for providing sensors in parallel with medical tools |
11228753, | Dec 28 2006 | D3D TECHNOLOGIES, INC | Method and apparatus for performing stereoscopic zooming on a head display unit |
11234611, | Jul 10 2008 | Covidien LP | Integrated multi-functional endoscopic tool |
11241164, | Jul 10 2008 | Covidien LP | Integrated multi-functional endoscopic tool |
11275242, | Dec 28 2006 | TIPPING POINT MEDICAL IMAGES, LLC | Method and apparatus for performing stereoscopic rotation of a volume on a head display unit |
11315307, | Dec 28 2006 | TIPPING POINT MEDICAL IMAGES, LLC | Method and apparatus for performing rotating viewpoints using a head display unit |
11331150, | Oct 28 1999 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
11409341, | Oct 01 2019 | Intel Corporation | Repeating graphics render pattern detection |
11446095, | Dec 24 2019 | BIOSENSE WEBSTER ISRAEL LTD | 2D pathfinder visualization |
11520415, | Dec 28 2006 | D3D Technologies, Inc. | Interactive 3D cursor for use in medical imaging |
11532244, | Sep 17 2020 | SIMBIONIX LTD | System and method for ultrasound simulation |
11596481, | Dec 24 2019 | BIOSENSE WEBSTER ISRAEL LTD | 3D pathfinder visualization |
11672604, | Oct 28 2016 | Covidien LP | System and method for generating a map for electromagnetic navigation |
11684491, | Jan 30 2003 | Medtronic Navigation, Inc. | Method and apparatus for post-operative tuning of a spinal implant |
11707363, | Jan 30 2003 | Medtronic Navigation, Inc. | Method and apparatus for post-operative tuning of a spinal implant |
11759264, | Oct 28 2016 | Covidien LP | System and method for identifying a location and/or an orientation of an electromagnetic sensor based on a map |
11783498, | Jun 03 2008 | Covidien LP | Feature-based registration method |
11786314, | Oct 28 2016 | Covidien LP | System for calibrating an electromagnetic navigation system |
11786317, | May 16 2016 | Covidien LP | System and method to access lung tissue |
11801024, | Oct 28 2015 | Medtronic Navigation, Inc. | Apparatus and method for maintaining image quality while minimizing x-ray dosage of a patient |
11914438, | Oct 01 2019 | Intel Corporation | Repeating graphics render pattern detection |
5644611, | Feb 16 1996 | Axsys Corporation | Method and apparatus for maximizing the number of radiological images displayed on a display screen |
5694142, | Jun 21 1993 | General Electric Company | Interactive digital arrow (d'arrow) three-dimensional (3D) pointing |
5803914, | Apr 15 1993 | KONINKLIJKE PHILIPS ELECTRONICS N V | Method and apparatus for displaying data in a medical imaging system |
5871445, | Apr 26 1993 | ST LOUIS UNIVERSITY | System for indicating the position of a surgical probe within a head on an image of the head |
5891034, | Oct 19 1990 | ST LOUIS UNIVERSITY | System for indicating the position of a surgical probe within a head on an image of the head |
5969725, | Mar 17 1995 | CANON KABUSHILKI KAISHA | Unification of three-dimensional image data having plural surface shapes |
5986662, | Oct 16 1996 | VITAL IMAGES, INC | Advanced diagnostic viewer employing automated protocol selection for volume-rendered imaging |
5999165, | Mar 27 1996 | NEC Corporation | Three-dimensional display system |
6076008, | Apr 26 1993 | St. Louis University | System for indicating the position of a surgical probe within a head on an image of the head |
6101234, | Nov 26 1997 | General Electric Company | Apparatus and method for displaying computed tomography fluoroscopy images |
6146390, | Apr 21 1992 | Sofamor Danek Holdings, Inc. | Apparatus and method for photogrammetric surgical localization |
6165181, | Apr 21 1992 | SOFAMOR DANEK HOLDINGS, INC | Apparatus and method for photogrammetric surgical localization |
6167145, | Mar 29 1996 | SURGICAL NAVIGATION TECHNOLOGIEIS, INC | Bone navigation system |
6226418, | Nov 07 1997 | Washington University | Rapid convolution based large deformation image matching via landmark and volume imagery |
6226548, | Sep 24 1997 | Medtronic Navigation, Inc | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
6235038, | Oct 28 1999 | Medtronic Surgical Navigation Technologies | System for translation of electromagnetic and optical localization systems |
6236875, | Oct 07 1994 | SURGICAL NAVIGATION TECHNOLOGIES, INC ; ST LOUIS UNIVERSITY | Surgical navigation systems including reference and localization frames |
6282437, | Aug 12 1998 | NEUTAR L L C | Body-mounted sensing system for stereotactic surgery |
6298262, | Apr 21 1998 | NEUTAR L L C | Instrument guidance for stereotactic surgery |
6347240, | Oct 19 1990 | ST LOUIS UNIVERSITY | System and method for use in displaying images of a body part |
6349273, | Feb 02 1998 | NEC Electronics Corporation | Atomic coordinates generating method |
6351662, | Aug 12 1998 | Neutar, LLC | Movable arm locator for stereotactic surgery |
6369812, | Nov 26 1997 | Philips Medical Systems, (Cleveland), Inc. | Inter-active viewing system for generating virtual endoscopy studies of medical diagnostic data with a continuous sequence of spherical panoramic views and viewing the studies over networks |
6374135, | Oct 19 1990 | SAINT LOUIS UNIVERSITY | System for indicating the position of a surgical probe within a head on an image of the head |
6379302, | Oct 28 1999 | Medtronic Navigation, Inc | Navigation information overlay onto ultrasound imagery |
6380958, | Sep 15 1998 | Siemens Aktiengesellschaft | Medical-technical system |
6381485, | Oct 28 1999 | Medtronic Navigation, Inc | Registration of human anatomy integrated for electromagnetic localization |
6402762, | Oct 28 1999 | Surgical Navigation Technologies, Inc. | System for translation of electromagnetic and optical localization systems |
6408107, | Jul 10 1996 | Washington University | Rapid convolution based large deformation image matching via landmark and volume imagery |
6429884, | Nov 24 1998 | Siemens Healthcare GmbH | Method and apparatus for processing and playback of digital images at a display monitor |
6434415, | Oct 19 1990 | St. Louis University; Surgical Navigation Technologies, Inc. | System for use in displaying images of a body part |
6490467, | Oct 19 1990 | Surgical Navigation Technologies; ST LOUIS UNIVERSITY | Surgical navigation systems including reference and localization frames |
6491702, | Apr 21 1992 | Sofamor Danek Holdings, Inc. | Apparatus and method for photogrammetric surgical localization |
6501818, | Nov 26 1997 | GE Medical Systems Global Technology Company, LLC | Apparatus and methods for displaying computed tomography fluoroscopy images including data transfer provided over a network |
6514082, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional examination with collapse correction |
6529765, | Apr 21 1998 | NEUTAR L L C | Instrumented and actuated guidance fixture for sterotactic surgery |
6542579, | Sep 30 1998 | Canon Kabushiki Kaisha | X-ray photo-taking system, X-ray photo-taken image display method, and storage medium |
6546277, | Apr 21 1998 | NEUTAR L L C | Instrument guidance system for spinal and other surgery |
6603868, | Nov 24 1998 | Siemens Healthcare GmbH | Method and apparatus for processing and playback of images at a display monitor |
6633686, | Nov 05 1998 | Washington University | Method and apparatus for image registration using large deformation diffeomorphisms on a sphere |
6669635, | Oct 28 1999 | Surgical Navigation Technologies, Inc. | Navigation information overlay onto ultrasound imagery |
6690397, | Jun 05 2000 | ADVANCED NEUROMODULATION SYSTEMS, INC | System for regional data association and presentation and method for the same |
6720960, | Sep 14 2000 | Leica Microsystems Heidelberg GmbH | Method for the analysis and evaluation of at least three-dimensional specimen data |
6725080, | Mar 01 2000 | SURGICAL NAVIGATION TECHNOLOGIES, INC | Multiple cannula image guided tool for image guided procedures |
6838879, | Mar 23 2001 | Koninklijke Philips Electronics N V | Magnetic resonance imaging method for an angulated cut plane with respect to a reference frame |
6891963, | Jan 06 1999 | Hitachi Medical Corporation | Image display |
6906724, | Oct 17 2001 | lntel Corporation | Generating a shadow for a three-dimensional model |
6924804, | Sep 25 2001 | Intel Corporation | Reducing the resolution of bones in a three-dimensional model |
6968224, | Oct 28 1999 | Surgical Navigation Technologies, Inc. | Method of detecting organ matter shift in a patient |
6975318, | Jun 25 2002 | Intel Corporation | Polygon binning process for tile-based rendering |
6975897, | Mar 16 2001 | GE Medical Systems Global Technology Company, LLC | Short/long axis cardiac display protocol |
6978166, | Oct 07 1994 | SURGICAL NAVIGATION TECHNOLOGIES, INC | System for use in displaying images of a body part |
6980206, | Jun 07 2001 | Intel Corporation | Rendering a three-dimensional model using a dither pattern |
6982715, | Jul 26 2002 | Intel Corporation | Mesh compression process |
7061501, | Nov 07 2000 | Intel Corporation | Rendering a pencil-sketch image |
7072704, | Oct 19 1990 | St. Louis University | System for indicating the position of a surgical probe within a head on an image of the head |
7113191, | Oct 25 1999 | Intel Corporation | Rendering a silhouette edge |
7116330, | Feb 28 2001 | Intel Corporation | Approximating motion using a three-dimensional model |
7127091, | Dec 22 2000 | Koninklijke Philips Electronics N V | Method and apparatus for visualizing a limited part of a 3D medical image-point-related data set, through basing a rendered image on an intermediate region between first and second clipping planes, and including spectroscopic viewing of such region |
7139601, | Apr 26 1993 | Surgical Navigation Technologies, Inc.; St. Louis University | Surgical navigation systems including reference and localization frames |
7146297, | Mar 27 2002 | Intel Corporation | Detecting collisions of three-dimensional models |
7148887, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping |
7180523, | Mar 31 2000 | U S BANK NATIONAL ASSOCIATION, AS COLLATERAL AGENT | Trimming surfaces |
7190374, | Feb 28 2001 | Intel Corporation | Shading polygons from a three-dimensional model |
7194117, | Jun 29 1999 | Research Foundation of State University of New York, The | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
7209138, | Oct 29 1999 | Intel Corporation | Image processing |
7217276, | Apr 20 1999 | Surgical Navigational Technologies, Inc. | Instrument guidance method and system for image guided surgery |
7227924, | Oct 06 2000 | UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL, THE | Computed tomography scanning system and method using a field emission x-ray source |
7260250, | Sep 30 2002 | The United States of America as represented by the Secretary of the Department of Health and Human Services | Computer-aided classification of anomalies in anatomical structures |
7301547, | Mar 22 2002 | Intel Corporation | Augmented reality system |
7313430, | Aug 28 2003 | Medtronic Navigation, Inc. | Method and apparatus for performing stereotactic surgery |
7324104, | Sep 14 2001 | Research Foundation of State University of New York, The | Method of centerline generation in virtual objects |
7337098, | May 18 1998 | Rigaku Corporation | Diffraction condition simulation device, diffraction measurement system, and crystal analysis system |
7356367, | Jun 06 2001 | The Research Foundation of State University of New York | Computer aided treatment planning and visualization with image registration and fusion |
7362334, | Jun 05 2000 | Advanced Neuromodulation Systems, Inc. | System for regional data association and presentation and method for the same |
7366562, | Oct 17 2003 | SURGICAL NAVIGATION TECHNOLOGIES, INC | Method and apparatus for surgical navigation |
7430271, | Nov 13 2000 | Digitome Corporation | Ray tracing kernel |
7474776, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
7477232, | Aug 28 2001 | VOLUME INTERACTIONS PTE LTD | Methods and systems for interaction with three-dimensional computer models |
7477768, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
7486811, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional virtual examination of objects, such as internal organs |
7496222, | Jun 23 2005 | General Electric Company | Method to define the 3D oblique cross-section of anatomy at a specific angle and be able to easily modify multiple angles of display simultaneously |
7502027, | Sep 13 1999 | Dassault Systemes SolidWorks Corporation | Electronic drawing viewer |
7515675, | Dec 07 2005 | MORPHO DETECTION, LLC | Apparatus and method for providing a near-parallel projection from helical scan data |
7525543, | Aug 09 2004 | SIEMENS HEALTHINEERS AG | High performance shading of large volumetric data using screen-space partial derivatives |
7542791, | Jan 30 2003 | Medtronic Navigation, Inc. | Method and apparatus for preplanning a surgical procedure |
7548241, | Jan 04 2002 | Intel Corporation | Determining a node path through a node graph |
7567834, | May 03 2004 | Medtronic Navigation, Inc | Method and apparatus for implantation between two vertebral bodies |
7570791, | Apr 25 2003 | Medtronic Navigation, Inc | Method and apparatus for performing 2D to 3D registration |
7574024, | Oct 02 2001 | Research Foundation of State University of New York, The | Centerline and tree branch skeleton determination for virtual objects |
7596256, | Sep 14 2001 | Research Foundation of State University of New York, The | Computer assisted detection of lesions in volumetric medical images |
7599730, | Nov 19 2002 | Medtronic Navigation, Inc | Navigation system for cardiac therapies |
7606613, | Mar 23 1999 | Medtronic Navigation, Inc | Navigational guidance via computer-assisted fluoroscopic imaging |
7612561, | Aug 11 2006 | Toshiba Medical Systems Corporation | Magnetic resonance diagnosing apparatus and its operating method |
7626589, | Dec 10 2003 | 3D Systems, Inc | Haptic graphical user interface for adjusting mapped texture |
7630750, | Feb 05 2001 | Research Foundation of State University of New York, The | Computer aided treatment planning |
7630753, | Feb 28 2002 | Medtronic Navigation, Inc. | Method and apparatus for perspective inversion |
7636595, | Oct 28 2004 | Medtronic Navigation, Inc. | Method and apparatus for calibrating non-linear instruments |
7646904, | Sep 30 2002 | The United States of America as represented by the Department of Health and Human Services | Computer-aided classification of anomalies in anatomical structures |
7657300, | Oct 28 1999 | Medtronic Navigation, Inc. | Registration of human anatomy integrated for electromagnetic localization |
7660623, | Jan 30 2003 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
7668285, | Feb 16 2004 | Toshiba Medical Systems Corporation | X-ray computed tomographic apparatus and image processing apparatus |
7689266, | Jul 22 2002 | Hitachi, LTD | Medical image diagnosis apparatus |
7697972, | Nov 19 2002 | Medtronic Navigation, Inc | Navigation system for cardiac therapies |
7706600, | Oct 02 2001 | The Research Foundation of State University of New York; Research Foundation of State University of New York, The | Enhanced virtual navigation and examination |
7751528, | Jul 19 2007 | UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL, THE | Stationary x-ray digital breast tomosynthesis systems and related methods |
7751865, | Oct 17 2003 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
7763035, | Dec 12 1997 | Medtronic Navigation, Inc. | Image guided spinal surgery guide, system and method for use thereof |
7786990, | Nov 22 2006 | Agfa Healthcare | Cursor mode display system and method |
7797032, | Oct 28 1999 | SURGICAL NAVIGATION TECHNOLOGIES, INC | Method and system for navigating a catheter probe in the presence of field-influencing objects |
7818044, | Oct 17 2003 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
7831082, | Jun 14 2000 | Medtronic Navigation, Inc. | System and method for image based sensor calibration |
7835497, | Feb 16 2007 | SIEMENS HEALTHINEERS AG | Method for automatic evaluation of scan image data records |
7835778, | Oct 16 2003 | Medtronic Navigation, Inc | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
7835784, | Sep 21 2005 | Medtronic Navigation, Inc | Method and apparatus for positioning a reference frame |
7840253, | Oct 17 2003 | Medtronic Navigation, Inc | Method and apparatus for surgical navigation |
7853305, | Apr 07 2000 | Medtronic Navigation, Inc. | Trajectory storage apparatus and method for surgical navigation systems |
7881770, | Mar 01 2000 | Medtronic Navigation, Inc. | Multiple cannula image guided tool for image guided procedures |
7889209, | Dec 10 2003 | 3D Systems, Inc | Apparatus and methods for wrapping texture onto the surface of a virtual object |
7925328, | Aug 28 2003 | Medtronic Navigation, Inc. | Method and apparatus for performing stereotactic surgery |
7928995, | Jun 05 2000 | Advanced Neuromodulation Systems, Inc. | System for regional data association and presentation and method for the same |
7953260, | Jun 09 2006 | CranioSim Solutions, Inc.; CRANIOSIM SOLUTIONS, INC | Predicting movement of soft tissue of the face in response to movement of underlying bone |
7953471, | May 03 2004 | Medtronic Navigation, Inc. | Method and apparatus for implantation between two vertebral bodies |
7971341, | Oct 17 2003 | Medtronic Navigation, Inc. | Method of forming an electromagnetic sensing coil in a medical instrument for a surgical navigation system |
7974677, | Jan 30 2003 | Medtronic Navigation, Inc. | Method and apparatus for preplanning a surgical procedure |
7978191, | Sep 24 2007 | Dolphin Imaging Systems, LLC | System and method for locating anatomies of interest in a 3D volume |
7991108, | Aug 18 2008 | Toshiba Medical Systems Corporation | Medical image processing apparatus, ultrasound imaging apparatus, X-ray CT (computed tomography) apparatus, and method of processing medical image |
7996064, | Mar 23 1999 | Medtronic Navigation, Inc. | System and method for placing and determining an appropriately sized surgical implant |
7998062, | Mar 29 2004 | Covidien LP | Endoscope structures and techniques for navigating to a target in branched structure |
8046052, | Nov 19 2002 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
8046053, | Oct 07 1994 | | System and method for modifying images of a body part |
8057407, | Oct 28 1999 | Medtronic Navigation, Inc. | Surgical sensor |
8060185, | Nov 19 2002 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
8074662, | Oct 28 1999 | Medtronic Navigation, Inc. | Surgical communication and power system |
8105339, | Dec 12 1997 | Sofamor Danek Holdings, Inc. | Image guided spinal surgery guide system and method for use thereof |
8112292, | Apr 21 2006 | Medtronic Navigation, Inc | Method and apparatus for optimizing a therapy |
8135566, | Jan 04 2002 | Intel Corporation | Determining a node path through a node graph |
8155262, | Apr 25 2005 | The University of North Carolina at Chapel Hill; Xintek, Inc. | Methods, systems, and computer program products for multiplexing computed tomography |
8160337, | Oct 11 2004 | Koninklijke Philips Electronics N V | Imaging system for the generation of high-quality X-ray projections |
8165658, | Sep 26 2008 | Medtronic, Inc | Method and apparatus for positioning a guide relative to a base |
8174535, | Dec 10 2003 | 3D Systems, Inc | Apparatus and methods for wrapping texture onto the surface of a virtual object |
8175681, | Dec 16 2008 | Medtronic Navigation Inc. | Combination of electromagnetic and electropotential localization |
8189890, | Sep 30 2002 | The United States of America as represented by the Secretary of the Department of Health and Human Services | Computer-aided classification of anomalies in anatomical structures |
8189893, | May 19 2006 | North Carolina State University | Methods, systems, and computer program products for binary multiplexing x-ray radiography |
8200314, | Jan 27 1993 | British Telecommunications public limited company | Surgical navigation |
8208708, | Mar 30 2006 | Koninklijke Philips Electronics N V | Targeting method, targeting device, computer readable medium and program element |
8217937, | Mar 28 2007 | The Aerospace Corporation | Isosurfacial three-dimensional imaging system and method |
8239001, | Oct 17 2003 | Medtronic Navigation, Inc | Method and apparatus for surgical navigation |
8271069, | Oct 17 2003 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
8290572, | Oct 28 1999 | Medtronic Navigation, Inc. | Method and system for navigating a catheter probe in the presence of field-influencing objects |
8320653, | Jun 14 2000 | Medtronic Navigation, Inc. | System and method for image based sensor calibration |
8331635, | Jul 06 2006 | University of South Florida | Cartesian human morpho-informatic system |
8358739, | Sep 03 2010 | UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL, THE | Systems and methods for temporal multiplexing X-ray imaging |
8359730, | Oct 17 2003 | Medtronic Navigation, Inc. | Method of forming an electromagnetic sensing coil in a medical instrument |
8380288, | Apr 29 2005 | Vanderbilt University | System and methods of using image-guidance for providing an access to a cochlear of a living subject |
8401616, | Nov 19 2002 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
8401620, | Oct 16 2006 | Perfint Healthcare Private Limited | Needle positioning apparatus and method |
8428326, | Oct 23 2008 | Immersion Corporation | Systems and methods for ultrasound simulation using depth peeling |
8428336, | Dec 01 1998 | Hitachi, Ltd. | Inspecting method, inspecting system, and method for manufacturing electronic devices |
8452061, | Nov 30 2005 | The Research Foundation of State University of New York | Electronic colon cleansing method for virtual colonoscopy |
8452068, | Jun 06 2008 | Covidien LP | Hybrid registration method |
8456484, | Dec 10 2003 | 3D Systems, Inc | Apparatus and methods for wrapping texture onto the surface of a virtual object |
8467589, | Jun 06 2008 | Covidien LP | Hybrid registration method |
8467851, | Sep 21 2005 | Medtronic Navigation, Inc. | Method and apparatus for positioning a reference frame |
8467853, | Nov 19 2002 | Medtronic Navigation, Inc. | Navigation system for cardiac therapies |
8471846, | Nov 28 2008 | AGFA HEALTHCARE N V | Method and apparatus for determining medical image position |
8473026, | Sep 15 1994 | STRYKER EUROPEAN HOLDINGS I, LLC | System for monitoring a position of a medical instrument with respect to a patient's body |
8473032, | Jun 03 2008 | Covidien LP | Feature-based registration method |
8494613, | Aug 31 2009 | Medtronic, Inc. | Combination localization system |
8494614, | Aug 31 2009 | Regents of the University of Minnesota; Medtronic, Inc. | Combination localization system |
8500451, | Jan 16 2007 | SIMBIONIX LTD | Preoperative surgical simulation |
8543338, | Jan 16 2007 | SIMBIONIX LTD | System and method for performing computerized simulations for image-guided procedures using a patient specific model |
8548565, | Oct 28 1999 | Medtronic Navigation, Inc. | Registration of human anatomy integrated for electromagnetic localization |
8549732, | Oct 17 2003 | Medtronic Navigation, Inc. | Method of forming an electromagnetic sensing coil in a medical instrument |
8600003, | Jan 16 2009 | UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL, THE | Compact microbeam radiation therapy systems and methods for cancer treatment and research |
8611492, | Jan 25 2011 | SIEMENS HEALTHINEERS AG | Imaging method for rotating a tissue region |
8611984, | Apr 08 2009 | Covidien LP | Locatable catheter |
8613748, | Nov 10 2010 | Perfint Healthcare Private Limited | Apparatus and method for stabilizing a needle |
8634897, | Apr 07 2000 | Medtronic Navigation, Inc. | Trajectory storage apparatus and method for surgical navigation systems |
8644907, | Oct 28 1999 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
8660635, | Sep 29 2006 | Medtronic, Inc | Method and apparatus for optimizing a computer assisted surgical procedure |
8663088, | Sep 15 2003 | Covidien LP | System of accessories for use with bronchoscopes |
8675022, | May 16 2006 | EDDA TECHNOLOGY, INC | Joy-stick like graphical user interface to adjust 3D cross sectional plane in 3D volume |
8696548, | Apr 17 2002 | Covidien LP | Endoscope structures and techniques for navigating to a target in branched structure |
8696685, | Apr 17 2002 | Covidien LP | Endoscope structures and techniques for navigating to a target in branched structure |
8705690, | Jan 25 2011 | SIEMENS HEALTHINEERS AG | Imaging method with improved display of a tissue region, imaging device, and computer program product |
8706185, | Oct 16 2003 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation of a multiple piece construct for implantation |
8731641, | Dec 16 2008 | Medtronic Navigation, Inc. | Combination of electromagnetic and electropotential localization |
8764725, | Feb 09 2004 | Covidien LP | Directional anchoring mechanism, method and applications thereof |
8768437, | Aug 20 1998 | Sofamor Danek Holdings, Inc. | Fluoroscopic image guided surgery system with intraoperative registration |
8774901, | Oct 16 2006 | Perfint Healthcare Private Limited | Needle positioning apparatus and method |
8798227, | Oct 15 2010 | Toshiba Medical Systems Corporation | Medical image processing apparatus and X-ray computed tomography apparatus |
8824759, | Feb 27 2008 | Agency for Science, Technology and Research | Correcting axial tilt based on object positions in axial slices of three dimensional image |
8838199, | Apr 04 2002 | Medtronic Navigation, Inc. | Method and apparatus for virtual digital subtraction angiography |
8845655, | Apr 20 1999 | Medtronic Navigation, Inc. | Instrument guide system |
8905920, | Sep 27 2007 | Covidien LP | Bronchoscope adapter and method |
8932207, | Jul 10 2008 | Covidien LP | Integrated multi-functional endoscopic tool |
8934604, | Sep 28 2007 | Toshiba Medical Systems Corporation | Image display apparatus and X-ray diagnostic apparatus |
8995608, | Jan 15 2010 | The University of North Carolina at Chapel Hill | Compact microbeam radiation therapy systems and methods for cancer treatment and research |
9055881, | May 01 2005 | Covidien LP | System and method for image-based alignment of an endoscope |
9089261, | Sep 15 2003 | Covidien LP | System of accessories for use with bronchoscopes |
9113813, | Apr 08 2009 | Covidien LP | Locatable catheter |
9117258, | Jun 03 2008 | Covidien LP | Feature-based registration method |
9168102, | Jan 18 2006 | Medtronic Navigation, Inc | Method and apparatus for providing a container to a sterile environment |
9210398, | May 20 2010 | Samsung Electronics Co., Ltd. | Method and apparatus for temporally interpolating three-dimensional depth image |
9271803, | Jun 06 2008 | Covidien LP | Hybrid registration method |
9366757, | Apr 27 2009 | SAMSUNG MEDISON CO., LTD. | Arranging a three-dimensional ultrasound image in an ultrasound system |
9504530, | Oct 28 1999 | Medtronic Navigation, Inc. | Method and apparatus for surgical navigation |
9542774, | Jan 04 2002 | Intel Corporation | Determining a node path through a node graph |
9575140, | Apr 03 2008 | Covidien LP | Magnetic interference detection system and method |
9597154, | Sep 29 2006 | Medtronic, Inc. | Method and apparatus for optimizing a computer assisted surgical procedure |
9612142, | Apr 27 2006 | General Electric Company | Method and system for measuring flow through a heart valve |
9642514, | Apr 17 2002 | Covidien LP | Endoscope structures and techniques for navigating to a target in a branched structure |
9659374, | Jun 03 2008 | Covidien LP | Feature-based registration method |
9665990, | Feb 08 2013 | EWOOSOFT CO , LTD | Image display to display 3D image and sectional images |
9668639, | Sep 27 2007 | Covidien LP | Bronchoscope adapter and method |
9675424, | Jun 04 2001 | Surgical Navigation Technologies, Inc. | Method for calibrating a navigation system |
9740989, | Mar 11 2013 | AUTODESK, Inc | Techniques for slicing a 3D model for manufacturing |
9754412, | Mar 11 2013 | AUTODESK, Inc | Techniques for slicing a 3D model for manufacturing |
9757087, | Feb 28 2002 | Medtronic Navigation, Inc. | Method and apparatus for perspective inversion |
9782136, | Jun 17 2014 | XINVIVO, INC | Intraoral tomosynthesis systems, methods, and computer readable media for dental imaging |
9784825, | Jul 15 2014 | Garmin Switzerland GmbH | Marine sonar display device with cursor plane |
9795348, | May 03 2010 | Koninklijke Philips Electronics N V | Medical viewing system and method for generating an angulated view of an object of interest |
9802364, | Oct 18 2011 | 3D Systems, Inc | Systems and methods for construction of an instruction set for three-dimensional printing of a user-customizable image of a three-dimensional structure |
9867721, | Jan 30 2003 | Medtronic Navigation, Inc | Method and apparatus for post-operative tuning of a spinal implant |
9875339, | Jan 27 2011 | SIMBIONIX LTD | System and method for generating a patient-specific digital image-based model of an anatomical structure |
9907520, | Jun 17 2014 | XINVIVO, INC | Digital tomosynthesis systems, methods, and computer readable media for intraoral dental tomosynthesis imaging |
9986895, | Sep 27 2007 | Covidien LP | Bronchoscope adapter and method |
RE39133, | Sep 24 1997 | Medtronic Navigation, Inc | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
RE42194, | Sep 24 1997 | Medtronic Navigation, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
RE42226, | Sep 24 1997 | Medtronic Navigation, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
RE43328, | Nov 20 1997 | Medtronic Navigation, Inc | Image guided awl/tap/screwdriver |
RE43952, | Oct 05 1989 | Medtronic Navigation, Inc. | Interactive system for local intervention inside a non-homogeneous structure |
RE44305, | Sep 24 1997 | Medtronic Navigation, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
RE45509, | Sep 24 1997 | Medtronic Navigation, Inc. | Percutaneous registration apparatus and method for use in computer-assisted surgical navigation |
RE46409, | Nov 20 1997 | Medtronic Navigation, Inc. | Image guided awl/tap/screwdriver |
RE46422, | Nov 20 1997 | Medtronic Navigation, Inc. | Image guided awl/tap/screwdriver |
Patent | Priority | Assignee | Title |
4259725, | Mar 01 1979 | General Electric Company | Cursor generator for use in computerized tomography and other image display systems |
4858129, | Sep 30 1986 | Kabushiki Kaisha Toshiba | X-ray CT apparatus |
4882679, | Nov 27 1987 | PICKER INTERNATIONAL, INC., AN OHIO CORP. | System to reformat images for three-dimensional display |
5250933, | Mar 02 1989 | Koninklijke Philips Electronics N V | Method and apparatus for the simultaneous display of one or more selected images |
Executed on | Assignor | Assignee | Conveyance | Reel | Frame |
Nov 27 1991 | YANOF, JEFFREY H | PICKER INTERNATIONAL, INC., A CORP. OF NEW YORK | ASSIGNMENT OF ASSIGNORS INTEREST | 005936 | 0810 |
Nov 27 1991 | MATTSON, RODNEY A | PICKER INTERNATIONAL, INC., A CORP. OF NEW YORK | ASSIGNMENT OF ASSIGNORS INTEREST | 005936 | 0810 |
Nov 27 1991 | PATEL, PAULA I | PICKER INTERNATIONAL, INC., A CORP. OF NEW YORK | ASSIGNMENT OF ASSIGNORS INTEREST | 005936 | 0810 |
Nov 29 1991 | Picker International, Inc. | (assignment on the face of the patent) | | |
Date | Maintenance Fee Events |
Apr 10 1998 | M183: Payment of Maintenance Fee, 4th Year, Large Entity. |
Apr 08 2002 | M184: Payment of Maintenance Fee, 8th Year, Large Entity. |
May 30 2006 | M1553: Payment of Maintenance Fee, 12th Year, Large Entity. |