A method and system are disclosed for generating enhanced images of multidimensional data using a depth-buffer segmentation process. The method and system operate in a computer system to modify an image by generating a reduced-dimensionality image data set from a multidimensional image by formulating a set of projection paths through image points selected from the multidimensional image, selecting an image point along each projection path, analyzing each image point to determine spatial similarities with at least one other point adjacent to the selected image point in a given dimension, and grouping the image point with the adjacent point if spatial similarity between the points is found, thereby defining the data set.

Patent
   RE43225
Priority
Apr 20 1999
Filed
Oct 10 2008
Issued
Mar 06 2012
Expiry
Apr 19 2020

TERM.DISCL.
Assg.orig
Entity
Large
EXPIRED<2yrs
0. 61. A method comprising:
generating a depth buffer array configured to associate a threshold intensity value and position of one or more representations of points along parallel projections in a multidimensional image data set;
segmenting the representations of points in a reduced-dimensionality image data set into one or more groups corresponding to the depth buffer array values, wherein the reduced-dimensionality image data set corresponds to the multidimensional image data set;
removing a subset of the one or more groups from the reduced-dimensionality image data set according to a number of points in a group; and
generating a two dimensional image based on remaining groups in the reduced-dimensionality image data set.
0. 38. An apparatus comprising:
means for creating a reduced-dimensionality (RD) image data set from a multidimensional (MD) image data set;
means for using the RD and MD image data sets to select a first set of points including an image point in the RD image data set, wherein individual points of the first set of points correspond to other individual points in a second set of points in the MD image data set;
means for using associated data in the RD and MD image data sets to generate an array comprising array values each corresponding to a distance measure along a projection path extending between a point in the first set of points and an associated point in the second set of points; and
means for indicating in the RD image data set at least one grouping of the image point and at least one point adjacent to the image point based on a similarity of at least one of a smoothness or roughness of the array values for the image point and the at least one point adjacent to the image point.
0. 63. A method for isolating image elements depicting an anatomical structure, comprising:
formulating a set of projection paths through points represented in a multidimensional image data set including the anatomical structure;
forming a set of representations of image points by selecting individual image points for the set along each projection path represented in the multidimensional image data set based on intensity values corresponding to the points represented in the multidimensional image data set;
determining whether each image point of the set of image points represented in the multidimensional image data set corresponds to an image element depicting the anatomical structure based on a smoothness and continuity of each image point represented in the multidimensional image data set; and
grouping image points of the set of image points represented in the multidimensional image data set that correspond to image elements represented in the multidimensional image data set depicting the at least one anatomical structure.
0. 31. An image processing method comprising:
generating a reduced-dimensionality (RD) image data set from a multidimensional (MD) image data set, the RD image data set representing a first set of points each corresponding to a point in a second set of points represented in the MD image data set;
using the RD and MD image data sets, determining distance measures for a plurality of projection paths each extending between corresponding points in the first and second sets of points;
using the RD image data set, calculating at least one of a smoothness or roughness value for a selected point in the first set of points by comparing a distance measure associated with the selected point to distance measures of other points in the first set of points that are within a predetermined proximity of the selected point; and
indicating in the RD image data set at least one grouping of the selected point with other points in the first set of points based on a predefined threshold similarity of the at least one of the smoothness or roughness values.
0. 45. An article of manufacture comprising a non-transitory computer-readable medium having stored thereon computer-executable instructions that configure a processing device to:
generate a reduced dimensionality (RD) image data set from a multidimensional (MD) image data set, the RD image data set comprising representations of individual points of a first set of points, the MD image data set comprising representations of a second set of points;
identify individual points of the first set of points each corresponding to different individual points of the second set of points, the corresponding points forming a point pair;
determine a spatial relationship between the corresponding points for each point pair by comparing spatial data associated with the first set of points to spatial data associated with the second set of points;
select a first image point in the first set of points based on intensity data, the first image point associated with a first image point pair; and
group points from the first set of points with the first image point to form a first group according to a comparison of a spatial relationship of the first image point pair to a spatial relationship of point pairs proximate the first image point pair.
0. 49. An apparatus comprising:
an imaging device configured to generate a multi-dimensional (MD) image data set representing an object; and
one or more image processors configured to:
generate a reduced-dimensionality (RD) image data set from the MD image data set,
wherein the RD image data set represents a first set of points each corresponding to a point in a second set of points represented in the MD image data set,
wherein each point in the second set of points is associated with a minimum or maximum intensity value along a representation of a projection path extending from a point in the first set of points, the corresponding points in the first and second sets of points forming a point pair,
using the RD and MD image data sets, identify an image point in the first set of points by comparing intensity values of corresponding points in the second set of points;
generate a buffer array comprising a value for each point in the first set of points, the value corresponding to each point pair associated with the first set of points, wherein the value is a distance measurement between corresponding points of the point pair;
calculate at least one of a smoothness or roughness value for points in the first set of points based on buffer array values; and
group particular points in the first set of points with the image point based on a predetermined threshold similarity of smoothness or roughness.
1. A method of grouping image data points based on smoothness or roughness values, the method comprising:
creating a reduced-dimensionality image data set from a multidimensional image data set;
selecting a first set of points in the reduced-dimensionality image data set, each point in the first set of points having a corresponding point in a second set of points in the multidimensional image data set;
defining a unique projection path for each point in the first set of points, the projection path extending in a direction Z from a point in the first set of points through the corresponding point in the second set of points;
determining a distance measure for a first point in the first set of points, the distance measure being the distance along the projection path in the Z direction between the first point in the first set of points and the first point's corresponding point in the second set of points;
determining the distance measures for multiple points, including an image point, in the first set of points;
calculating a smoothness or roughness value for the image point in the first set of points by comparing the distance measure of the image point to the distance measures of other points in the first set of points; and
grouping the image point with similar points in the first set of points, each of said similar points having a smoothness or roughness value related to the smoothness or roughness value of the image point.
0. 57. A system comprising:
an imaging device configured to generate a three dimensional (3d) image data set representing a 3d image; and
one or more image processors configured to:
generate a two dimensional (2D) image data set representing a 2D image generated from the 3d image data set, wherein the 2D image data set represents a plurality of 2D points on a plane bisecting a plurality of 3d points represented in the 3d image data set;
assign an intensity value to each of the plurality of 2D points represented in the 2D image data set, wherein the intensity values are each associated with a minimum or maximum intensity value of a corresponding 3d point represented in the 3d image data set,
wherein the representation of the corresponding 3d point is along a representation of a projection path including a 2D point of the plurality of 2D points and the corresponding 3d point represented in the 2D and 3d image data sets;
generate a depth buffer array comprising depth measures for each of the plurality of 2D points represented in the 2D image data set,
wherein the depth measures for each of the plurality of 2D points are measured with respect to the corresponding 3d point represented in the 3d image data set;
compare one or more intensity values and depth measures; and
group the one or more 2D points represented in the 2D image data set into one or more anatomical structure groups and one or more background groups based on the comparison.
2. The method according to claim 1, wherein the smoothness or roughness value of the image point is determined using a least squares fit of the distance measures of points in proximity to the image point.
3. The method according to claim 1, wherein a Z-buffer array comprises the distance measures of multiple points in the first set of points.
4. The method according to claim 1, further comprising converting grouped and ungrouped points into a multi-dimensional image.
5. The method according to claim 4, further comprising performing region growing within the multi-dimensional image.
6. The method according to claim 5, further comprising hollowing out the multi-dimensional image.
7. The method according to claim 6, wherein the hollowing out comprises removing each pixel from a group that is surrounded on each side by a pixel from said group.
8. The method according to claim 1, further comprising displaying an image of grouped and ungrouped image points.
9. The method according to claim 1, wherein the multidimensional image data set is a magnetic resonance derived image set.
10. The method according to claim 1, wherein the multidimensional image data set is a computed tomography derived image set.
11. The method according to claim 1, further comprising compensating for variations in sensitivity along projection paths to enhance a projection image.
12. The method according to claim 3, further comprising applying a process to the buffer array to enhance the Z buffer array based upon expected properties of adjacent points in the buffer array.
13. The method according to claim 12, wherein the process comprises measuring array element roughness in a plurality of directions around each array element in the Z buffer array.
14. The method according to claim 1, wherein the projection paths are curvilinear.
15. The method according to claim 1, wherein the projection paths are divergent from a point of origin.
16. The method according to claim 1, wherein the proximity is defined as being no more than two image element positions from the image point.
17. The method according to claim 1, wherein the proximity is based on point adjacency.
18. The method according to claim 1, further comprising manipulating image groups for enhanced display.
19. The method according to claim 18, wherein the image manipulation consists of hollowing image structures.
20. The method according to claim 19, wherein the hollowing step comprises removing voxels in a group that are surrounded in multiple directions by adjacent voxels in the same group.
21. The method according to claim 1, further comprising displaying a resulting image.
22. The method according to claim 21, wherein the display comprises summation of the multi-dimensional image along projection lines.
23. The method according to claim 21, wherein the display comprises shading volume surfaces.
24. The method according to claim 1, further identifying bifurcations or branches of groups segmented from the multidimensional image.
25. The method of claim 1, wherein a corresponding point in the second set of points comprises a maximum intensity value along the corresponding point's projection path.
26. The method of claim 1, wherein a corresponding point in the second set of points comprises a minimum intensity value along the corresponding point's projection path.
27. The method of claim 1, wherein a corresponding point in the second set of points comprises a value above or below a predefined value.
28. The method of claim 27, wherein the corresponding point in the second set of points comprises an intensity value above an average background value.
29. The method of claim 28, wherein the corresponding point in the second set of points comprises an intensity value more than two standard deviations above the average background value.
30. The method according to claim 1, wherein the smoothness or roughness value of the image point is determined using chi-square values of the fit of the distance measures of points in proximity to the image point.
0. 32. The method according to claim 31, further comprising storing the distance measures in a depth buffer array.
0. 33. The method according to claim 31, further comprising:
identifying in the MD image data set, representations of a subset of points from the second set of points that correspond to the grouping of the selected point with other points in the first set of points;
using the MD image data set, determining an intensity of points neighboring each point of the subset of points in the second set of points; and
indicating in the MD image data set, an addition of the neighboring points to the subset of points represented in the MD image data set in the second set of points according to the intensity.
0. 34. The method according to claim 33, further comprising removing representations of the subset of points from the MD image data set based at least in part on a number of points in the subset.
0. 35. The method according to claim 31 further comprising applying maximum intensity projection processing to the MD image data set to generate the RD image data set.
0. 36. The method according to claim 31, wherein the MD image data set comprises three dimensional (3d) image data.
0. 37. The method according to claim 31, further comprising displaying an image formed by summing the MD image data set along the projection path.
0. 39. The apparatus according to claim 38, wherein selecting the image point is based on a comparison of brightness intensity values corresponding to the individual points in the RD image data set.
0. 40. The apparatus according to claim 38, further comprising means for displaying the first set of points.
0. 41. The apparatus according to claim 38, wherein the at least one of a smoothness or roughness value is based on a least squares fit analysis of a predetermined number of the array values of the array, wherein the predetermined number of the array values are centered by an array value corresponding to the image point and wherein the least squares fit analysis is performed on the array values in a predetermined number of directions.
0. 42. The apparatus according to claim 38, further comprising:
means for using associated data in the RD and MD image data sets to map the grouping in the first set of points to a subset of corresponding points in the second set of points; and
means for adding a new point to the subset from the second set of points based on a predetermined variance of a brightness intensity of the new point in comparison to an average brightness intensity of other points in the second set of points.
0. 43. The apparatus according to claim 42, wherein the predetermined variance is two standard deviations.
0. 44. The apparatus according to claim 38, wherein the means for creating the RD image data set from the MD image data set comprises a means for applying maximum intensity projection processing to the MD image data set to create the RD image data set.
0. 46. The article of manufacture according to claim 45, wherein the computer-executable instructions further configure the processing device to add points from the first set of points to the first group based on a comparison of spatial relationships of point pairs proximate different point pairs associated with the first group.
0. 47. The article of manufacture according to claim 45, wherein the computer-executable instructions further configure the processing device to:
compare a disparity between the spatial relationships of point pairs in the first group;
remove one or more points from the first group based on the disparity; and
select a second image point in the second set of points, wherein the second image point corresponds to the first image point based on intensity data of the second image point, the second image point associated with a second image point pair.
0. 48. The article of manufacture according to claim 47, wherein the computer-executable instructions further configure the processing device to group points proximate the second image point with the second image point to form a second group according to a comparison of a spatial relationship of the second image point pair to spatial relationships of point pairs associated with the points proximate the second image point.
0. 50. The apparatus according to claim 49, wherein the one or more image processors are further configured to:
using the RD and MD image data sets, identify additional image points among ungrouped points in the first set of points, in order according to intensity values of points in the second set of points corresponding to the ungrouped points in the first set of points; and
group other points in the first set of points with one of the additional image points based on the predetermined threshold similarity of smoothness or roughness to identify an additional group.
0. 51. The apparatus according to claim 49, wherein the one or more image processors are further configured to:
using the RD and MD image data sets, map the grouped points in the first set of points to a corresponding group of points in the second set of points;
determine a relative intensity of points in the second set of points neighboring the corresponding group of points, the intensity measured with respect to a reference background; and
add neighboring points to the corresponding group of points based on the relative intensity.
0. 52. The apparatus according to claim 51, wherein the one or more image processors are further configured to remove the group from the first set of points associated with the RD image data set, based on a threshold number of points defining a valid group.
0. 53. The apparatus according to claim 52, wherein the one or more image processors are further configured to cause a computer to display the resulting RD image data set.
0. 54. The apparatus according to claim 49, wherein the imaging device is a magnetic resonance imaging device.
0. 55. The apparatus according to claim 49, wherein the imaging device is a computed tomography imaging device.
0. 56. The apparatus according to claim 49, wherein the imaging device is an X-ray computed tomography angiography imaging device.
0. 58. The system according to claim 57, wherein the one or more image processors are further configured to:
group the 3d points represented in the 3d image data set by mapping 2D points represented in the 2D image data set corresponding to the one or more anatomical structure groups to corresponding 3d points represented in the 3d image data set;
determine an intensity of 3d points neighboring grouped 3d points represented in the 3d image data set; and
add the neighboring 3d points to the grouped 3d points based on the intensity of the neighboring 3d points.
0. 59. The system according to claim 57, wherein the one or more image processors are further configured to remove representations of 2D points associated with the one or more background groups from the 2D image data set.
0. 60. The system according to claim 58, wherein the one or more image processors are further configured to remove one or more points from the grouped 3d points represented in the 3d image data set if the point to be removed is bordered on each side by a 3d point also belonging to grouped 3d points represented in the 3d image data set; and
further comprising a display configured to display a resulting 3d image.
0. 62. The method according to claim 61, further comprising:
mapping representations of points of the remaining groups to one or more corresponding representations of points in the multidimensional image data set to form one or more associated groups;
determining an intensity value for representations of points proximate the representations of points of the one or more associated groups in the multidimensional image data set, the intensity measured with respect to a reference background; and
adding the neighboring points to the group based on the intensity.
0. 64. The method according to claim 63, wherein determining further comprises analyzing the smoothness and continuity of the image point represented in the multidimensional image data set with respect to at least one adjacent image point also represented in the multidimensional image data set.
0. 65. The method according to claim 64, wherein analyzing further comprises comparing a similarity of continuity between the image point represented in the multidimensional image data set and the at least one adjacent image point also represented in the multidimensional image data set.
0. 66. The method according to claim 64, wherein analyzing further comprises comparing a similarity of smoothness between the image point represented in the multidimensional image data set and the at least one adjacent image point also represented in the multidimensional image data set.
0. 67. The method according to claim 66, wherein the at least one anatomical structure is a blood vessel.
0. 68. The method according to claim 1, the multidimensional image data set representing an image of an anatomical structure.

and brightness, Vi, is
Vi = A/(B + χi²)
where the constants A and B are empirically determined. The brightness values displayed in FIG. 7 tend to increase with the likelihood that the element is part of a vessel.
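The brightness mapping above can be sketched in code; the values of A and B below are arbitrary placeholders, since the text states only that the constants are determined empirically:

```python
import numpy as np

def brightness_from_chi2(chi2, A=1.0, B=0.1):
    """Display brightness Vi = A / (B + chi_i^2): smooth points
    (low chi^2) render bright, rough points fade out.
    A=1.0 and B=0.1 are placeholder values, not from the disclosure."""
    return A / (B + np.asarray(chi2, dtype=float))
```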

Other methods of measuring smoothness in the MIP Z-buffer could be used. The current method involves performing a low order least squares fit to a small number of image elements centered at a particular point. In the current embodiment, the process utilizes a first order fit along five points, and an example of applying the fit in four principal directions is shown in FIG. 6. Alternatively, a different number of points, a different order fit, and a different number of directions may be used in measuring local roughness or smoothness as desired by the system designer, or the process could be omitted completely in some implementations.
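The roughness measurement described above — a first-order least squares fit over five Z-buffer samples, evaluated in four principal directions — might be sketched as follows. The function names, the 5-point window, and the omission of border handling are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def chi2_line_fit(values):
    """Chi-square (residual sum of squares) of a first-order least
    squares line fit to a short run of Z-buffer depth values."""
    values = np.asarray(values, dtype=float)
    x = np.arange(values.size)
    coeffs = np.polyfit(x, values, 1)       # first-order (straight line) fit
    residuals = values - np.polyval(coeffs, x)
    return float(np.sum(residuals ** 2))

def min_chi2_roughness(zbuf, y, x, half=2):
    """Minimum chi-square over the four principal directions (horizontal,
    vertical, both diagonals) through (y, x), each line using
    2*half + 1 = 5 samples.  Border handling is omitted in this sketch."""
    offsets = range(-half, half + 1)
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]
    chi2s = []
    for dy, dx in directions:
        line = [zbuf[y + k * dy, x + k * dx] for k in offsets]
        chi2s.append(chi2_line_fit(line))
    # a vessel only requires smoothness in one direction, so keep the minimum
    return min(chi2s)
```

A perfectly linear depth run yields a chi-square near zero; an erratic run near a vessel edge yields a large value.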

As may be deduced from FIG. 6, only points that are contained within vessels will have a very low χ² value. Points just outside the vessel edges generally experience erratic Z-buffer depth values at some point along each of the lines selected for the fit, resulting in high χ² values. Because a vessel only requires smoothness in one direction, the process saves only the minimum χ² value for each point. The minimum χ² image is illustrated in FIG. 9, where the brightness is inversely proportional to the minimum χ² value of the least squares fits of short line segments along the principal directions.

It is then possible to look throughout the minimum χ² image for regions of connected “smoothness,” as performed in Step 222 of FIG. 2, in order to determine regions that are most likely to be vessels. Such regions are indicated by the grouped image elements of FIGS. 12 through 14, which illustrate points in the Z-buffer that are grouped together based upon their low roughness values and proximity. The brightness shown in FIGS. 12 through 14 is proportional to the number of points in each group.

In the implementation of step 222, the process performs a grouping operation where each data point is considered in a specific order. The use of the MIP image implies that the bright parts of the original three dimensional image data were the most important. As such, the process performs the connectivity operations by selecting the brightest image element in the MIP image. This element is tested for a low value in the corresponding minimum χ² array. If the minimum χ² is below a predefined threshold, the point is selected as a possible vessel and the neighboring points in the 2D MIP image are tested. The brightest element and all of its neighbors and their neighbors, and so forth, which satisfy the connectivity criterion, are added to the group. The process then considers the next non-classified brightest point and determines which remaining neighboring points satisfy the connectivity criterion to form another group. The process continues until all points have been tested. To be connected in this embodiment of the invention, the minimum χ² value must be below a given threshold and the Z-position must be within a small distance of the currently considered point. For example, in this illustration, the threshold value for χ² equaled 1.0 and a step of +/−2 in Z was allowed. Other values may be selected according to the needs of the designer, but this step size recognizes that larger vessels occasionally project from different regions within the vessel, and the larger value improves the character of the connectedness. FIGS. 16 through 18 illustrate images of the points that have been grouped based upon the proximity in “Z” and minimum roughness. The intensity of the display is proportional to the number of points in each group. It is shown that some vessels, although obviously “connected” in real life, are not connected by this implementation of the process (e.g. the middle cerebral artery is not connected to the trifurcation area).
These disconnects happen because only one point in a thick vessel is projected by the MIP algorithm, and the thickness of the vessel allows larger jumps in “Z” than are accepted by the connectivity criteria used in this example.
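A rough sketch of the step-222 grouping described above, assuming 4-connected neighbors, the example threshold χ² < 1.0, and a Z step of ±2 (the names and data layout are hypothetical):

```python
import numpy as np
from collections import deque

def group_by_smoothness(mip, zbuf, min_chi2, chi2_thresh=1.0, z_step=2):
    """Brightest-first connectivity grouping over the MIP image.
    A point joins a group when its minimum chi^2 is below chi2_thresh and
    its Z-buffer depth is within +/- z_step of the neighbor that reached it.
    4-connectivity and all names here are illustrative assumptions."""
    h, w = mip.shape
    labels = np.zeros((h, w), dtype=int)
    next_label = 0
    # visit points in order of decreasing MIP brightness
    for flat in np.argsort(mip, axis=None)[::-1]:
        y, x = divmod(int(flat), w)
        if labels[y, x] or min_chi2[y, x] >= chi2_thresh:
            continue
        next_label += 1
        labels[y, x] = next_label
        queue = deque([(y, x)])
        while queue:
            cy, cx = queue.popleft()
            for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
                ny, nx = cy + dy, cx + dx
                if (0 <= ny < h and 0 <= nx < w and not labels[ny, nx]
                        and min_chi2[ny, nx] < chi2_thresh
                        and abs(float(zbuf[ny, nx]) - float(zbuf[cy, cx])) <= z_step):
                    labels[ny, nx] = next_label
                    queue.append((ny, nx))
    return labels
```

Points whose minimum χ² never falls below the threshold remain unlabeled (background), matching the text's behavior for points outside vessels.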

Once all the contiguous groups of minimum roughness are determined in the MIP Z-buffer array, there are still many vessels that are crossed by other vessels and remain broken. Further, only one point per projection line is included in the vessel group. It is therefore useful to map each group of vessel points back to their original three dimensional image positions, where a region growing algorithm is utilized for each point in each group to add additional points from the three dimensional image data to each vessel group, as shown in step 224. In step 224, to complete the process of connecting all vessels, the groups are mapped back to the three dimensional image space, and all points from the two dimensional groups are considered again in the same order. All neighboring points in the 3D data are tested based upon their intensity being a selected amount (e.g. two standard deviations) above the average background value. Points that are two standard deviations or more above the average background would have projected in the MIP image had they not been obscured by other brighter structures, such as vessels. By adding these points to the groups, the process fills out the vessels in 3D and also connects most branches that were “disconnected” by being obscured by other vessels. An image of points connected by the process in step 224 is illustrated in FIG. 11.
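The step-224 region growing might be sketched as follows, assuming 6-connected voxels and the two-standard-deviation intensity criterion given in the text (the function name and seed format are hypothetical):

```python
import numpy as np
from collections import deque

def grow_group_3d(volume, seeds, background_mean, background_std, k=2.0):
    """Grow a vessel group in 3D from seed voxels mapped back from a 2D
    group: add 6-connected neighbors whose intensity exceeds the average
    background by k standard deviations (k=2 in the described example)."""
    threshold = background_mean + k * background_std
    in_group = set(seeds)
    queue = deque(seeds)
    nz_max, ny_max, nx_max = volume.shape
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            voxel = (z + dz, y + dy, x + dx)
            if (0 <= voxel[0] < nz_max and 0 <= voxel[1] < ny_max
                    and 0 <= voxel[2] < nx_max and voxel not in in_group
                    and volume[voxel] > threshold):
                in_group.add(voxel)
                queue.append(voxel)
    return in_group
```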

As indicated by step 226, extraneous groups are removed from the set of grouped image points created in step 224. For example, noise groups can be removed automatically by eliminating all groups with very small numbers of elements. In this example, all groups with fewer than 45 elements were considered to be noise and were removed. All the major vessels subtend more than 45 elements, and very few noise groups have more than 45 elements (voxels). Where groups with large numbers of voxels are not of interest, such as regions of fat, bone, or organ tissue in angiographic imaging, they can be eliminated by a manual step that allows the user to select the groups to be removed and cause those groups not to display.
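The noise-group removal of step 226 reduces to a size filter over the group labels; the 45-element cutoff below matches the example in the text, and the label-array representation is an assumption:

```python
import numpy as np

def remove_small_groups(labels, min_size=45):
    """Relabel any group with fewer than min_size elements as background
    (label 0).  min_size=45 matches the example given in the text."""
    labels = labels.copy()
    ids, counts = np.unique(labels, return_counts=True)
    for gid, count in zip(ids, counts):
        if gid != 0 and count < min_size:
            labels[labels == gid] = 0
    return labels
```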

A qualitative evaluation of the results of the segmentation process can be performed by comparing the elements segmented as belonging to vessels with those seen in the original cross-sectional images. The DBS process performs well in classifying vessel and non-vessel voxels. The misclassifications consist of the smallest vessels that did not appear in the MIP and the centers of the carotid arteries that were dark in the original 3D data. FIGS. 12 through 14 show an example of points in the original 3D image that, as shown in FIG. 15, were manually classified as vessel and non-vessel by an independent observer. The points segmented by the DBS process of the present invention as vessels and non-vessels are shown as white and black regions of points in FIGS. 12 through 14. The manually classified points appear as white points in these same figures. FIG. 15 illustrates graphs of the performance of the segmentation based on the vessel and non-vessel classification of the segmented image of FIGS. 12 through 14 as performed by an independent observer. The graphs represent the vessel inclusion sensitivity as measured by the number of voxels in a group, from a few hundred up to about 1000. Line 910 represents very small vessels seen only in local projections. Line 912 represents small vessels as seen in the MIP images. Line 914 represents the medium size secondary bifurcations, M2, A2. Line 916 represents large vessels, such as the internal carotid and middle cerebral arteries.

Once all of the data elements have been classified by the DBS algorithm as groups of structures such as vessels, any of the previously mentioned variety of display techniques may be utilized, including MIPs and shaded surface renderings. By displaying the original MRA intensities of only those points classified as vessel, it is straightforward to perform densitometric summations along lines through the original data. The resulting images, as shown in FIGS. 16 through 18, look much like subtraction X-ray angiograms, but the dynamic range is so large that it is hard to portray the vessels at a single window/level setting on a video display. More specifically, FIG. 10 illustrates a stereo pair X-ray-like densitometric projection through DBS process images from the aneurysm study of FIG. 3. FIG. 16 is a stereo pair of the full intracranial circulation, while FIG. 17 illustrates a stereo pair of vessels connected to the anterior and middle cerebral arteries. It is difficult to show the full dynamic range of intensities contained in the images of FIGS. 16 and 17. The dynamic range is reduced by removing all points internal to the DBS segmented structures. An X-ray-like densitometric projection using the hollow DBS process in accordance with the present invention is shown in FIG. 18, where some surface shading is also added to the X-ray-like projection image. In FIG. 18, the intracranial carotid artery element 1010 is visible below the aneurysm. Thus, the process manipulates the dynamic range of information for display, as shown in step 228 of FIG. 2, to enhance the resolution of the vessel structures of interest during the display.
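The densitometric summation described above amounts to summing the vessel-masked original intensities along the projection lines; a minimal sketch, assuming parallel projection along the first array axis:

```python
import numpy as np

def densitometric_projection(volume, vessel_mask):
    """Keep the original intensities of only the voxels classified as
    vessel, then sum along projection lines (here the first axis) to form
    an X-ray-like densitometric image.  Axis choice is an assumption."""
    masked = np.where(vessel_mask, volume, 0.0)
    return masked.sum(axis=0)
```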

As just noted, one example of performing dynamic range reduction is to eliminate all data points within the image that have neighbors in every direction. This results in an image where the vessels appear as hollow tubes. Typical displays of densitometric projections through the hollow vessel images are shown in FIG. 18 and in FIGS. 19 through 26. The character of the image may be further modified by adding an amount of shading to each vessel surface, enabling the observer to determine whether the vessels are near or far and making the vessel orientation more discernible as well. Once the final manipulation of the dynamic range is performed, the process, as shown in step 230, displays the processed image on an imaging device, such as a video monitor, and/or prints the image on a printing apparatus. Optional display processes are contemplated that may yield more visual information. The present embodiment of the invention may not eliminate all noise, but the noise is reduced to the point that the useful data is easily recognized over the remaining noise points.
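The interior-point elimination described above can be sketched as follows, assuming a boolean NumPy mask of the DBS-segmented voxels and taking "neighbors in every direction" to mean all six face-adjacent neighbors (an assumption for illustration; the text does not specify the exact neighborhood):

```python
import numpy as np

def hollow(mask):
    """Remove interior voxels: a voxel is interior when all six
    face-adjacent neighbors also lie inside the segmented structure.
    What remains is the one-voxel-thick surface (hollow tube)."""
    # Pad with False so voxels on the volume boundary count as surface.
    m = np.pad(mask, 1, mode="constant", constant_values=False)
    interior = (
        m[2:, 1:-1, 1:-1] & m[:-2, 1:-1, 1:-1] &   # +/- x neighbors
        m[1:-1, 2:, 1:-1] & m[1:-1, :-2, 1:-1] &   # +/- y neighbors
        m[1:-1, 1:-1, 2:] & m[1:-1, 1:-1, :-2]     # +/- z neighbors
    )
    return mask & ~interior
```

This is equivalent to subtracting a binary erosion of the segmentation from itself; only surface voxels survive, which is why the summed projections resemble hollow tubes.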

FIGS. 19 through 22 illustrate stereo pair X-ray-like densitometric reprojections of the results of the DBS segmentation of points from the 3D image of a patient with a posterior communicating artery (PCOM) aneurysm and a basilar tip aneurysm. FIG. 19 is the cranial view of the PCOM aneurysm, while FIG. 20 is the caudal view. FIG. 21 highlights the posterior communicating artery aneurysm 1108, while FIG. 22 highlights a small basilar tip aneurysm, which is behind tip 1110 and is shown from two different orientations.

FIGS. 23 and 24 illustrate a stereoscopic X-ray-like densitometric reprojection through a renal artery study in accordance with the present invention. FIG. 23 shows the anterior view, while FIG. 24 illustrates the posterior view.

Lastly, FIGS. 25 and 26 represent an example of CTA images acquired at relatively low resolution on a helical CT scanner, with segmentation performed by the DBS process in accordance with the present invention. FIG. 25 depicts the axial collapse of the segmented data structures, while FIG. 26 illustrates shaded hollow-body reprojections of the aortic aneurysm.

The DBS process as described and presented in the present invention results in an image segmentation process that is readily applicable to magnetic resonance angiography (MRA), computed tomography angiography (CTA), rotational X-ray angiography (XRCTA), and other medical and non-medical applications. Since the DBS process is based upon the generic MIP algorithm, the application of the DBS process can be extended wherever the MIP algorithm is currently being used. The utility of the DBS process can be enhanced to include display options that allow the user to toggle between the DBS process and the preliminary MIP process, as well as other forms of volume rendering. The DBS process is also applicable to such fields as computer assisted screening and image interpretation based upon segmented anatomy.

The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims, rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.

Roberts, John A., Parker, Dennis L., Alexander, Andrew L., Chapman, Brian E.

Patent Priority Assignee Title
8503746, May 26 2008 Hitachi, LTD Medical image processing device, medical image processing method, and medical image processing program
9424680, Apr 16 2010 Koninklijke Philips N.V. Image data reformatting
Patent Priority Assignee Title
5201035, Jul 09 1990 The United States of America as represented by the Secretary of the Air Dynamic algorithm selection for volume rendering, isocontour and body extraction within a multiple-instruction, multiple-data multiprocessor
5271094, Jun 29 1990 International Business Machines Corporation Z-buffer quantization of three-dimensional lines
5402532, Mar 12 1991 International Business Machines Corporation Direct display of CSG expression by use of depth buffers
5544650, Apr 08 1988 AUTOCYTE NORTH CAROLINA, L L C Automated specimen classification system and method
5644689, Jan 13 1992 Hitachi, Ltd. Arbitrary viewpoint three-dimensional imaging method using compressed voxel data constructed by a directed search of voxel data representing an image of an object and an arbitrary viewpoint
5649061, May 11 1995 The United States of America as represented by the Secretary of the Army Device and method for estimating a mental decision
5757954, Sep 20 1994 TRIPATH IMAGING, INC Field prioritization apparatus and method
5825363, May 24 1996 Microsoft Technology Licensing, LLC Method and apparatus for determining visible surfaces
6028955, Feb 16 1996 Uber Technologies, Inc Determining a vantage point of an image
6031941, Dec 27 1995 Canon Kabushiki Kaisha Three-dimensional model data forming apparatus
6275718, Mar 23 1999 Bausch & Lomb Incorporated Method and apparatus for imaging and analysis of ocular tissue
6430309, Sep 15 1995 Hologic, Inc; Biolucent, LLC; Cytyc Corporation; CYTYC SURGICAL PRODUCTS, LIMITED PARTNERSHIP; SUROS SURGICAL SYSTEMS, INC ; Third Wave Technologies, INC; Gen-Probe Incorporated Specimen preview and inspection system
6456285, May 06 1998 Microsoft Technology Licensing, LLC Occlusion culling for complex transparent scenes in computer generated graphics
6674884, Aug 24 1996 EVIDENT SCIENTIFIC, INC Apparatus for remote control of a microscope
6750974, Apr 02 2002 Electro Scientific Industries, Inc Method and system for 3D imaging of target regions
6768811, Nov 20 2001 MANON BUSINESS INITIATION LTD System and method for analysis of imagery data
6919892, Aug 14 2002 AVAWORKS, INCORPROATED Photo realistic talking head creation system and method
Executed on | Assignor | Assignee | Conveyance | Frame/Reel/Doc
Jan 07 2003 | University of Utah | NATIONAL INSTITUTES OF HEALTH (NIH), U.S. DEPT. OF HEALTH AND HUMAN SERVICES (DHHS), U.S. GOVERNMENT, NIH DIVISION OF EXTRAMURAL INVENTIONS AND TECHNOLOGY RESOURCES (DEITR) | CONFIRMATORY LICENSE (SEE DOCUMENT FOR DETAILS) | 027183/0633
Feb 21 2006 | ALEXANDER, ANDREW L. | University of Utah | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 027183/0684
Feb 23 2006 | CHAPMAN, BRIAN E. | University of Utah | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 027183/0684
Mar 21 2006 | PARKER, DENNIS L. | University of Utah | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 027183/0684
Mar 21 2006 | ROBERTS, JOHN A. | University of Utah | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 027183/0684
Mar 21 2006 | University of Utah | University of Utah Research Foundation | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 027183/0759
Oct 10 2008 | University of Utah Research Foundation | (assignment on the face of the patent)
Date Maintenance Fee Events
Mar 26 2014 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity.
May 21 2018 | REM: Maintenance Fee Reminder Mailed.
Nov 12 2018 | EXP: Patent Expired for Failure to Pay Maintenance Fees.


Date Maintenance Schedule
Mar 06 2015 | 4 years fee payment window open
Sep 06 2015 | 6 months grace period start (w surcharge)
Mar 06 2016 | patent expiry (for year 4)
Mar 06 2018 | 2 years to revive unintentionally abandoned end (for year 4)
Mar 06 2019 | 8 years fee payment window open
Sep 06 2019 | 6 months grace period start (w surcharge)
Mar 06 2020 | patent expiry (for year 8)
Mar 06 2022 | 2 years to revive unintentionally abandoned end (for year 8)
Mar 06 2023 | 12 years fee payment window open
Sep 06 2023 | 6 months grace period start (w surcharge)
Mar 06 2024 | patent expiry (for year 12)
Mar 06 2026 | 2 years to revive unintentionally abandoned end (for year 12)