A method and system for providing a user interface for polyp annotation, segmentation, and measurement in computed tomography colonography (CTC) volumes is disclosed. The interface receives an initial polyp position in a CTC volume and automatically segments the polyp based on that position. In order to segment the polyp, a polyp tip is detected in the CTC volume using a trained 3D point detector. A local polar coordinate system is then fit to the colon surface in the CTC volume with its origin at the detected polyp tip. Polyp interior voxels and polyp exterior voxels are detected along each axis of the local polar coordinate system using a trained 3D box detector. A boundary voxel is detected on each axis of the local polar coordinate system based on the detected polyp interior voxels and polyp exterior voxels by boosted 1D curve parsing using a trained classifier. This results in a segmented polyp boundary. The segmented polyp is displayed in the user interface, and a user can modify the segmented polyp boundary using the interface. The interface can measure the size of the segmented polyp in three dimensions. The user can also use the interface for polyp annotation in CTC volumes.
1. A method for providing an interface for polyp annotation, segmentation, and measuring in computed tomography colonography (CTC) volumes, comprising:
receiving an initial polyp location in a CTC volume;
automatically segmenting a polyp in the CTC volume based on the initial polyp location by:
detecting a polyp tip in a neighborhood of the initial polyp location,
fitting a local polar coordinate system to a colon surface in the CTC volume based on the detected polyp tip, the local polar coordinate system including a plurality of axes,
detecting polyp interior and polyp exterior voxels along each of the plurality of axes of the local polar coordinate system, and
segmenting a boundary of the polyp in the CTC volume by detecting a boundary voxel for each of the plurality of axes of the local polar coordinate system based on the detected polyp interior and polyp exterior voxels;
displaying the segmented polyp in the CTC volume in a user interface;
receiving a user input via the user interface to modify the segmented polyp; and
re-segmenting the segmented polyp based on the received user input.
19. An apparatus for providing an interface for polyp annotation, segmentation, and measuring in computed tomography colonography (CTC) volumes, comprising:
means for receiving an initial polyp location in a CTC volume;
means for segmenting a polyp in the CTC volume based on the initial polyp location, comprising:
means for detecting a polyp tip in a neighborhood of the initial polyp location,
means for fitting a local polar coordinate system to a colon surface in the CTC volume based on the detected polyp tip, the local polar coordinate system including a plurality of axes,
means for detecting polyp interior and polyp exterior voxels along each of the plurality of axes of the local polar coordinate system, and
means for segmenting a boundary of a polyp in the CTC volume by detecting a boundary voxel for each of the plurality of axes of the local polar coordinate system based on the detected polyp interior and polyp exterior voxels;
means for displaying the segmented polyp in the CTC volume;
means for receiving a user input to modify the segmented polyp; and
means for re-segmenting the segmented polyp based on the received user input.
25. A non-transitory computer readable medium encoded with computer executable instructions for providing an interface for polyp annotation, segmentation, and measuring in computed tomography colonography (CTC) volumes, the computer executable instructions defining steps comprising:
receiving an initial polyp location in a CTC volume;
automatically segmenting a polyp in the CTC volume based on the initial polyp location by:
detecting a polyp tip in a neighborhood of the initial polyp location,
fitting a local polar coordinate system to a colon surface in the CTC volume based on the detected polyp tip, the local polar coordinate system including a plurality of axes,
detecting polyp interior and polyp exterior voxels along each of the plurality of axes of the local polar coordinate system, and
segmenting a boundary of the polyp in the CTC volume by detecting a boundary voxel for each of the plurality of axes of the local polar coordinate system based on the detected polyp interior and polyp exterior voxels;
displaying the segmented polyp in the CTC volume in a user interface;
receiving a user input via the user interface to modify the segmented polyp; and
re-segmenting the segmented polyp based on the received user input.
2. The method of
receiving the CTC volume; and
receiving a user input via the user interface identifying the initial polyp location.
3. The method of
detecting the initial polyp location in the CTC volume using a computer aided detection (CAD) system.
4. The method of
detecting polyp tip voxel candidates from colon surface voxels of the CTC volume using a trained 3D point detector;
clustering the polyp tip candidate voxels into a plurality of clusters;
calculating a fitness score for each of the clusters based on probabilities determined by the 3D point detector for the polyp tip voxel candidates in each cluster;
determining the geometric mean of the cluster with the maximal fitness score; and
projecting the geometric mean to a voxel on the colon surface.
5. The method of
6. The method of
generating a colon surface model in the CTC volume; and
fitting the local polar coordinate system to the colon surface model with an origin of the local polar coordinate system at the detected polyp tip.
7. The method of
displaying the colon surface model in the user interface.
8. The method of
determining a probability of a voxel being in the polyp for each voxel along each axis of the local polar coordinate system using a trained 3D box detector.
9. The method of
10. The method of
detecting a boundary voxel for each axis of the local polar coordinate system based on the probabilities determined by the trained 3D box detector by boosted 1D curve parsing using a trained classifier; and
stacking the detected boundary voxels to generate the boundary of the polyp.
11. The method of
12. The method of
receiving a user input via the user interface adjusting a position of at least one boundary voxel of the segmented polyp boundary.
13. The method of
smoothing the segmented boundary with the adjusted position of the at least one boundary voxel.
14. The method of
receiving a user input via the user interface adjusting a position of the detected polyp tip.
15. The method of
smoothing the segmented boundary of the polyp.
16. The method of
displaying cutting planes of the CTC volume corresponding to the axes of the local polar coordinate system in the user interface.
17. The method of
measuring a size of the re-segmented polyp in three dimensions.
18. The method of
detecting first, second, and third polyp orthogonal axes in the re-segmented polyp.
20. The apparatus of
means for receiving a user input identifying the initial polyp location.
21. The apparatus of
means for receiving a user input adjusting a position of at least one boundary voxel of the segmented polyp boundary.
22. The apparatus of
means for receiving a user input adjusting a position of the detected polyp tip.
23. The apparatus of
means for displaying cutting planes of the CTC volume corresponding to the axes of the local polar coordinate system.
24. The apparatus of
means for measuring a size of the re-segmented polyp in three dimensions.
26. The non-transitory computer readable medium of
displaying cutting planes of the CTC volume corresponding to the axes of the local polar coordinate system in the user interface.
27. The non-transitory computer readable medium of
measuring a size of the re-segmented polyp in three dimensions.
This application claims the benefit of U.S. Provisional Application No. 60/974,102, filed Sep. 21, 2007, the disclosure of which is herein incorporated by reference.
The present invention relates to 3D computed tomography (CT) colonography, and more particularly, to providing a user interface for polyp annotation, segmentation, and measurement in 3D CT colonography.
Colon cancer is the number two cause of cancer death in men and women combined, but it is one of the most preventable cancers because doctors can identify and remove pre-cancerous growths known as polyps. 3D CT colonography (CTC), or virtual colonoscopy, is emerging as a powerful polyp screening tool because of its non-invasiveness, low cost, and high sensitivity. 3D CTC visualizes the colon and allows a physician to navigate the colon to search for polyps. However, the physician needs to manually adjust the navigation speed and change the viewing angle in order to see a polyp clearly. For example, a polyp may be hidden in a colonic fold and thus could be missed during the physician's visual inspection. Accordingly, there has been much research in computer aided detection (CAD) of polyps in CTC, and several such CAD systems have been proposed. Once a polyp is detected either manually or automatically, the polyp is measured and classified by a physician. However, there is large variability in physicians' measurements of polyps. Therefore, an accurate, consistent, and automated method for polyp measurement is desirable.
Polyp segmentation is defined as extracting or isolating a polyp from the colon wall at a given location. In addition to its significant value for polyp measurement in clinical practice, polyp segmentation is also important for computer aided detection of polyps. Polyp segmentation is a challenging task because polyps are abnormal growths from the colon wall and the "expected" segmentations are often semantic, perceptual boundaries with low imaging contrast support. Furthermore, there are multiple shape categories of polyps, such as sessile, pedunculated, and flat, with large 3D shape/appearance variation. Conventional polyp segmentation methods utilize unsupervised segmentation or clustering in 3D data volumes. Such methods include fuzzy clustering, deformable models or snakes, variational level-set methods, and heuristic surface curvature constraints. These unsupervised approaches work well for only up to 70% of polyps, failing on the remainder due to unclear polyp/non-polyp boundaries, large within-class polyp appearance/shape variations, or limiting heuristic shape assumptions. Accordingly, an automatic polyp segmentation method having increased accuracy is desirable.
The present invention provides a method and system for providing an interface for polyp annotation, segmentation, and measurement in 3D computed tomography colonography (CTC) volumes.
In one embodiment of the present invention, an initial polyp location in a CTC volume is received by a user interface. The interface automatically segments the polyp based on the initial polyp location and displays the segmented polyp. In order to segment the polyp, a polyp tip is detected in the CTC volume using a trained 3D point detector. A local polar coordinate system is then fit to the colon surface in the CTC volume with its origin at the detected polyp tip. Polyp interior voxels and polyp exterior voxels are detected along each axis of the local polar coordinate system using a trained 3D box detector. A boundary voxel is detected on each axis of the local polar coordinate system based on the detected polyp interior voxels and polyp exterior voxels by boosted 1D curve parsing using a trained classifier. The segmented polyp can be displayed in the user interface by displaying 2D cutting planes corresponding to axes of the local polar coordinate system. A user can modify the segmented polyp boundary using the interface, and the interface re-segments the polyp based on the user input. The interface can measure the size of the segmented polyp in three dimensions.
These and other advantages of the invention will be apparent to those of ordinary skill in the art by reference to the following detailed description and the accompanying drawings.
The present invention is directed to an interface for polyp annotation, segmentation, and measurement in 3D computed tomography colonography (CTC) volumes. Embodiments of the present invention are described herein to give a visual understanding of the polyp annotation, segmentation, and measurement. A digital image is often composed of digital representations of one or more objects (or shapes). The digital representation of an object is often described herein in terms of identifying and manipulating the object. Such manipulations are virtual manipulations accomplished in the memory or other circuitry/hardware of a computer system. Accordingly, it is to be understood that embodiments of the present invention may be performed within a computer system using data stored within the computer system.
Embodiments of the present invention are directed to a semi-automatic user interface for 3D polyp boundary annotation, segmentation, and dimensioning measurement in 3D CTC. The user interface offers increased flexibility and freedom in CTC volume 3D navigation by fitting a set of polar coordinate axes onto user-identified or automatically detected polyp locations. The traditional orthonormal (XY, YZ, ZX planes) 2D slices of a 3D CTC volume are significantly enhanced with 2D radial cut planes, which can be centered on any user selection point. The user interface can be used for semi-automatic user annotation with machine learning segmentation, as opposed to fully manual annotation in previous systems. The trained learning segmentation subsystem can automatically generate an initial polyp segmentation boundary, which experienced users can easily modify using expert knowledge to improve the segmented polyp boundary. For less-experienced users, the automatic segmentation functionality can serve as a trainer to guide annotation. This not only eliminates the ambiguity of labeling polyps for new users, but also saves the time needed to train them. Using this interface, users can modify the segmented polyp boundary by simply moving the output polyp boundary points to more accurate positions when necessary. The three axes of the segmented 3D polyp surface can be automatically calculated. Conventional labeling systems can only find the largest polyp axis or select from multiple axis hypotheses by experts using a trial-and-compare strategy. Accordingly, the interface produces more accurate polyp size measurements than conventional labeling systems.
In addition to the features described above, the polyp annotation, segmentation, and measurement interface can also provide various other functions, such as user-controlled contour smoothing, contrast visibility control, easier access to CTC volume sequences, and separately staged polyp processing display for performance/error analysis and volume visualization under different modalities (i.e., original volume, voxel-based gradient surface volume, mesh-based surface volume).
The interface can be implemented as a graphical user interface. Such a graphical user interface can be implemented by a computer system executing computer executable instructions stored on a computer readable medium.
Returning to
At step 604, the polyp is automatically segmented in the CTC volume based on the initial polyp location. The interface utilizes machine-learning based segmentation with stored classifiers to segment a boundary of the polyp. The automatic polyp segmentation may be performed in response to a user input, such as a user selecting a specified control of the interface (e.g., polyp segmentation button 418 of
The 3D point detector can be trained as a probabilistic boosting tree (PBT) classifier using axis-based steerable features. A PBT classifier is trained by recursively constructing a tree, where each of the nodes represents a strong classifier. Once the strong classifier of a node is trained, the input training data for the node is classified into two sets (positives and negatives) using the learned strong classifier. The two new sets are fed to the left and right child nodes, respectively, to train the left and right child nodes. In this way, the PBT classifier is constructed recursively. The trained classifier (3D point detector) is denoted herein as PBT1. The trained 3D point detector (PBT1) is used to detect polyp tip voxel candidates from the surface voxels of the sub-volume. Accordingly, PBT1 determines a positive-class probability value prob(ν) (from 0 to 1) for each surface voxel ν of the sub-volume. If prob(ν)>T1 for a voxel, where T1 is a first threshold value, then the voxel is classified as a polyp tip voxel candidate.
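The recursive tree construction described above can be sketched as follows. This is a minimal illustrative stand-in, not the patented implementation: a one-feature decision stump plays the role of each node's strong classifier (the actual system uses boosted classifiers over axis-based steerable features), and all names are hypothetical.

```python
import numpy as np

class Stump:
    """Stand-in for a node's strong classifier: one threshold on one feature."""
    def fit(self, X, y):
        best = (0, 0.0, 1, 1.0)  # (feature, threshold, sign, error)
        for f in range(X.shape[1]):
            for t in np.unique(X[:, f]):
                for sign in (1, -1):
                    pred = (sign * (X[:, f] - t) > 0).astype(int)
                    err = np.mean(pred != y)
                    if err < best[3]:
                        best = (f, t, sign, err)
        self.f, self.t, self.sign = best[0], best[1], best[2]
        return self

    def predict(self, X):
        return (self.sign * (X[:, self.f] - self.t) > 0).astype(int)

class PBTNode:
    """Recursive tree: each node trains a classifier, splits the training
    data by its predictions, and feeds the two sets to child nodes."""
    def __init__(self, depth=2, min_samples=4):
        self.depth, self.min_samples = depth, min_samples
        self.left, self.right = None, None

    def fit(self, X, y):
        self.clf = Stump().fit(X, y)
        pred = self.clf.predict(X)
        if self.depth > 1:
            for side, mask in (("right", pred == 1), ("left", pred == 0)):
                # recurse only on impure, sufficiently large splits
                if mask.sum() >= self.min_samples and len(np.unique(y[mask])) > 1:
                    child = PBTNode(self.depth - 1, self.min_samples).fit(X[mask], y[mask])
                    setattr(self, side, child)
        # empirical positive rate on each side, used as leaf probability
        self.pos_rate = {s: (y[pred == s].mean() if (pred == s).any() else float(s))
                         for s in (0, 1)}
        return self

    def prob(self, x):
        p = self.clf.predict(x[None, :])[0]
        child = self.right if p == 1 else self.left
        return child.prob(x) if child is not None else self.pos_rate[p]
```

For perfectly separable data the tree reduces to a single stump whose leaf probabilities are 0 and 1; on harder data the recursion refines the ambiguous branches.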
From all the detected polyp tip voxel candidates (S{ν}={ν: prob(ν)>T1}) in a volume (or sub-volume), one voxel is detected as the polyp tip voxel by clustering and centering. The first threshold value T1 can be determined based on receiver operating characteristics (ROC) of the trained PBT classifier (PBT1). Connected component analysis is applied to partition the set S{ν} into a list of n clusters C1{ν}, C2{ν}, . . . , Cn{ν}. For each of the clusters, a fitness score Pi = Σν∈Ci{ν} prob(ν) is calculated by summing the probabilities determined by PBT1 for the candidate voxels in the cluster, and the cluster with the maximal fitness score is selected. The geometric mean of the selected cluster is then determined and projected to a voxel on the colon surface, which is taken as the detected polyp tip.
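The clustering-and-centering step can be sketched numerically as follows. This is illustrative only: function and variable names are hypothetical, and 26-connectivity is approximated by a Chebyshev-distance-1 test between candidate voxel coordinates.

```python
import numpy as np

def detect_polyp_tip(surface_voxels, probs, T1):
    """Threshold candidates at prob > T1, cluster them by 26-connectivity,
    score each cluster by the sum of its probabilities, and return the best
    cluster's centroid projected onto the nearest colon-surface voxel."""
    cand_idx = np.where(probs > T1)[0]
    cands = surface_voxels[cand_idx]
    # simple BFS clustering: voxels within Chebyshev distance 1 are connected
    n = len(cands)
    labels = -np.ones(n, dtype=int)
    n_clusters = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        stack = [i]
        labels[i] = n_clusters
        while stack:
            j = stack.pop()
            d = np.abs(cands - cands[j]).max(axis=1)
            for k in np.where((d <= 1) & (labels == -1))[0]:
                labels[k] = n_clusters
                stack.append(k)
        n_clusters += 1
    # fitness score P_i = sum of candidate probabilities in cluster i
    scores = [probs[cand_idx[labels == c]].sum() for c in range(n_clusters)]
    best = int(np.argmax(scores))
    center = cands[labels == best].mean(axis=0)  # cluster centroid
    # project the centroid back onto the closest colon-surface voxel
    dists = np.linalg.norm(surface_voxels - center, axis=1)
    return surface_voxels[np.argmin(dists)]
```

In practice a library routine such as `scipy.ndimage.label` would replace the hand-rolled BFS, but the structure of the computation is the same.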
Returning to
Returning to
The fitting of a local polar coordinate system, which is a spatially ordered collection of 1D curve axes, is used to represent the 3D polyp surface because of its flexibility and convenient parameterization. Compared to conformal mapping, no user-identified 3D-2D correspondences are needed and there is no constraint that the 3D surface have disk topology. Furthermore, this representation, which uses a collection of 1D curves as sampling coordinates to model the 3D colon surface shape, reduces the intrinsic dimensionality of the problem (i.e., 1D vs. 3D modeling). This decomposable formulation makes it feasible to construct or model a large variety of 3D shapes using an assembly of simple 1D curves that are learnable and flexible.
Returning to
At step 710, the polyp surface boundary is detected using boosted 1D curve parsing and stacking. In order to detect the polyp boundary, this step is treated as a statistical curve parsing problem of finding polyp/non-polyp breaking points on 1D curves. Given the output probabilities from the polyp interior/exterior detection (step 212), another layer of boundary boosting learning is applied, based on the theory of stacked generalization. This is equivalent to learning a new boundary point classifier in the embedded, semantic space learned by the 3D box detector (as a wrapper) in the previous step. The boundary point classifier is trained as a PBT using 1D curve-parsing features derived from the probabilities determined for voxels by the 3D box detector, such as probability count features and probability gap features. The resulting classifier is denoted herein as PBT3.
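The probability count and probability gap features mentioned above might look like the following sketch. The exact feature set is not specified here, so these particular features are illustrative assumptions: fractions and means of interior probabilities on each side of a candidate break point, plus the drop in probability across it.

```python
import numpy as np

def curve_features(probs, k):
    """Features for a candidate polyp/non-polyp break point k on one polar
    axis, computed from per-voxel interior probabilities along the axis."""
    before, after = probs[:k], probs[k:]
    mb = before.mean() if len(before) else 0.0
    ma = after.mean() if len(after) else 0.0
    return np.array([
        (before > 0.5).mean() if len(before) else 0.0,  # interior count fraction before k
        (after > 0.5).mean() if len(after) else 0.0,    # interior count fraction after k
        mb,                                             # mean probability before k
        ma,                                             # mean probability after k
        mb - ma,                                        # probability gap across k
    ])
```

A boundary point classifier would then be trained on such feature vectors, one per candidate break point, which is the sense in which the second classifier operates in the space learned by the first.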
Using the trained classifier PBT3, a new probability value Pij is calculated for each voxel νij on the local polar coordinate axes by evaluating forward and backward probability arrays Af(ρij) and Ab(ρij) output by PBT2. To determine a unique boundary point on any local polar coordinate axis i, the voxel νik with the maximal probability value Pik on that axis can be selected as the boundary voxel.
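One simple selection rule for a unique boundary point per axis (an assumption here, since the selection criterion is truncated in the text) is to take the per-axis argmax of the boosted probabilities:

```python
import numpy as np

def pick_boundary_indices(axis_probs):
    """One boundary voxel per polar axis: the sample index with the maximal
    boosted boundary probability along that axis (first maximum on ties).
    `axis_probs` is a list of 1D arrays, one per polar coordinate axis."""
    return [int(np.argmax(p)) for p in axis_probs]
```

Stacking the per-axis picks in axis order then yields the closed polyp boundary referred to in the claims.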
At step 712, the segmented polyp surface boundary is smoothed. Assuming that the selection of each μik
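The smoothing at step 712 can be illustrated with a circular moving average over the per-axis boundary radii. This is an illustrative choice, since the document's exact smoothing formulation is not fully reproduced above; the cyclic padding reflects that the polar axes wrap around the polyp tip.

```python
import numpy as np

def smooth_boundary(radii, window=3):
    """Cyclic moving-average smoothing of per-axis boundary radii.
    `window` must be odd; the ends wrap around, as the axes do."""
    radii = np.asarray(radii, dtype=float)
    k = window // 2
    # pad cyclically so every position has a full window
    padded = np.concatenate([radii[-k:], radii, radii[:k]])
    return np.convolve(padded, np.ones(window) / window, mode='valid')
```

A constant boundary is left unchanged, while an isolated outlier radius is pulled toward its neighbors, which is the qualitative behavior one wants after a user drags a single boundary point.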
The method of
Returning to
At step 610, the user modifies the detected polyp boundary points on the local polar coordinate axes or the detected polyp tip using the interface. Accordingly, the interface receives a user input modifying the polyp boundary points or the detected polyp tip. As shown in
Once the user modifies either the detected polyp tip or the detected boundary points, the polyp is re-segmented in the CTC volume based on the modifications. If the polyp tip is modified, the segmentation method can be performed again with the user-modified polyp tip used in place of the result of the 3D point detector. If the user modifies one or more polyp boundary points on the polar coordinate axes, the modified boundary points can be kept to define the polyp boundary together with the non-modified boundary points, and the smoothing step of the automatic segmentation is performed to smooth the polyp boundary including the modified boundary points.
At step 612, once the user is satisfied with the polyp segmentation results that are displayed by the interface, the interface automatically measures the dimensions of the segmented polyp. It is possible that the dimensioning of the polyp be performed in response to a user input requesting polyp measurement. The polyp dimensions can be calculated by detecting three orthogonal polyp axes of the segmented polyp. After the polyp boundary is segmented for a polyp, a set of positive polyp surface voxels (with their associated probabilities ρ) inside the segmented polyp boundary can be enumerated. It is then straightforward to calculate the polyp dimensions. A pair of positive voxels ν11, ν12 is detected with the maximal Euclidean distance from each other to define a first polyp axis. Using the distance dist(ν11,ν12) as the estimated polyp size, other pairs of voxels (ν2′, ν2″) whose connecting direction is orthogonal to the first axis (i.e., (ν2′−ν2″)·(ν11−ν12)=0) are identified. A pair of voxels ν21, ν22 is then selected from these orthogonal pairs with the maximal Euclidean distance from each other as the second polyp axis. A third polyp axis is then calculated by similarly finding a third orthogonal axis. The detected polyp axes give the size of the polyp in three dimensions. The polyp dimensions can be output by the interface.
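The three-axis dimensioning procedure can be sketched as a brute-force search over voxel pairs. This is illustrative only: an angular tolerance replaces exact orthogonality (discrete voxel pairs rarely satisfy it exactly), at least one near-orthogonal pair is assumed to exist at each stage, and all names are hypothetical.

```python
import numpy as np

def polyp_axes(voxels, ang_tol=0.1):
    """Three orthogonal polyp axes by brute force: the farthest voxel pair
    defines axis 1; axis 2 is the farthest pair whose direction is nearly
    orthogonal to axis 1; axis 3 is the farthest pair nearly orthogonal to
    both. Returns the three axis lengths (polyp size in three dimensions)."""
    V = np.asarray(voxels, dtype=float)
    pairs = [(i, j) for i in range(len(V)) for j in range(i + 1, len(V))]

    def dist(p):
        return float(np.linalg.norm(V[p[0]] - V[p[1]]))

    def farthest(candidates):
        return max(candidates, key=dist)

    def direction(p):
        d = V[p[0]] - V[p[1]]
        return d / np.linalg.norm(d)

    p1 = farthest(pairs)
    u1 = direction(p1)
    # keep pairs whose direction is (near-)orthogonal to axis 1
    orth1 = [p for p in pairs if abs(np.dot(direction(p), u1)) < ang_tol]
    p2 = farthest(orth1)  # assumes at least one such pair exists
    u2 = direction(p2)
    orth2 = [p for p in orth1 if abs(np.dot(direction(p), u2)) < ang_tol]
    p3 = farthest(orth2)
    return dist(p1), dist(p2), dist(p3)
```

The O(n²) pair enumeration is acceptable for the small voxel sets bounded by a single polyp; for larger sets a convex-hull diameter computation would be the usual optimization.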
In addition to being displayed by the interface, such as on a display device of a computer system, the segmented polyp boundary and the calculated polyp size measurements can also be output by being stored on a storage, memory, or other computer readable medium. A physician can use such polyp segmentation and measurement results for assisting with polyp annotation, as well as diagnostic and medical purposes, for example for polyp removal or diagnosis of colon cancer.
The above-described polyp annotation, segmentation, and measurement methods, as well as the above-described polyp annotation, segmentation, and measurement interface may be implemented on a computer using well-known computer processors, memory units, storage devices, computer software, and other components. A high level block diagram of such a computer is illustrated in
The foregoing Detailed Description is to be understood as being in every respect illustrative and exemplary, but not restrictive, and the scope of the invention disclosed herein is not to be determined from the Detailed Description, but rather from the claims as interpreted according to the full breadth permitted by the patent laws. It is to be understood that the embodiments shown and described herein are only illustrative of the principles of the present invention and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention. Those skilled in the art could implement various other feature combinations without departing from the scope and spirit of the invention.
Salganicoff, Marcos, Comaniciu, Dorin, Barbu, Adrian, Bogoni, Luca, Lu, Le, Wolf, Matthias, Lakare, Sarang
Patent | Priority | Assignee | Title |
8913831, | Jul 31 2008 | HEWLETT-PACKARD DEVELOPMENT COMPANY, L P | Perceptual segmentation of images |
9087259, | Jul 30 2010 | Koninklijke Philips Electronics N V | Organ-specific enhancement filter for robust segmentation of medical images |
9888905, | Sep 29 2014 | Toshiba Medical Systems Corporation | Medical diagnosis apparatus, image processing apparatus, and method for image processing |
Patent | Priority | Assignee | Title |
5458111, | Sep 06 1994 | WILLIAM C BOND; STAFFORD, THOMAS P | Computed tomographic colonoscopy |
6212420, | Mar 13 1998 | University of Iowa Research Foundation | Curved cross-section based system and method for gastrointestinal tract unraveling |
6331116, | Sep 16 1996 | Research Foundation of State University of New York, The | System and method for performing a three-dimensional virtual segmentation and examination |
6514082, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional examination with collapse correction |
6928314, | Jan 23 1998 | Mayo Foundation for Medical Education and Research | System for two-dimensional and three-dimensional imaging of tubular structures in the human body |
6947784, | Apr 07 2000 | General Hospital Corporation, The | System for digital bowel subtraction and polyp detection and related techniques |
7148887, | Sep 16 1996 | The Research Foundation of State University of New York | System and method for performing a three-dimensional virtual segmentation and examination with optical texture mapping |
7260250, | Sep 30 2002 | The United States of America as represented by the Secretary of the Department of Health and Human Services | Computer-aided classification of anomalies in anatomical structures |
7333644, | Mar 11 2003 | SIEMENS HEALTHINEERS AG | Systems and methods for providing automatic 3D lesion segmentation and measurements |
7346209, | Sep 30 2002 | BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNVERSITY, THE | Three-dimensional pattern recognition method to detect shapes in medical images |
7369638, | Jul 11 2003 | Siemens Medical Solutions USA, Inc | System and method for detecting a protrusion in a medical image |
7379572, | Oct 16 2001 | University of Chicago | Method for computer-aided detection of three-dimensional lesions |
7440601, | Oct 10 2003 | Dolby Laboratories Licensing Corporation | Automated identification of ileocecal valve |
7454045, | Oct 10 2003 | HEALTH AND HUMAN SERVICES, UNITED STATES OF AMERICA, THE, AS REPRESENTED BY THE SECRETARY | Determination of feature boundaries in a digital representation of an anatomical structure |
7492968, | Sep 07 2004 | SIEMENS HEALTHINEERS AG | System and method for segmenting a structure of interest using an interpolation of a separating surface in an area of attachment to a structure having similar properties |
7630529, | Apr 07 2000 | The General Hospital Corporation | Methods for digital bowel subtraction and polyp detection |
7634133, | Mar 04 2004 | SIEMENS HEALTHINEERS AG | Segmentation of structures based on curvature slope |
20060239552, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Sep 05 2008 | Siemens Medical Solutions USA, Inc. | (assignment on the face of the patent) | / | |||
Oct 02 2008 | LU, LE | Siemens Corporate Research, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0090 | |
Oct 02 2008 | COMANICIU, DORIN | Siemens Corporate Research, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0090 | |
Oct 03 2008 | WOLF, MATTHIAS | Siemens Medical Solutions USA, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0085 | |
Oct 03 2008 | LAKARE, SARANG | Siemens Medical Solutions USA, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0085 | |
Oct 03 2008 | BOGONI, LUCA | Siemens Medical Solutions USA, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0085 | |
Oct 03 2008 | SALGANICOFF, MARCOS | Siemens Medical Solutions USA, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0085 | |
Oct 21 2008 | BARBU, ADRIAN | Siemens Corporate Research, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 021840 | /0097 | |
Apr 03 2009 | Siemens Corporate Research, Inc | Siemens Medical Solutions USA, Inc | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 022506 | /0495 |
Date | Maintenance Fee Events |
Jul 15 2015 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Oct 21 2019 | REM: Maintenance Fee Reminder Mailed. |
Apr 06 2020 | EXP: Patent Expired for Failure to Pay Maintenance Fees. |