An apparatus for detecting a lesion is provided. The apparatus includes an extracting unit configured to extract at least one tissue region from an image of tissue regions, a setting unit configured to set at least one of the at least one extracted tissue region as a lesion detection candidate region, and a detecting unit configured to detect a lesion from the lesion detection candidate region.
10. A method for detecting a lesion, the method comprising:
partitioning an image into a plurality of regions by performing an image segmentation;
comparing feature information of the partitioned plurality of regions to feature information of at least one tissue region;
determining at least one region corresponding to a pre-specified type of at least one tissue region having a higher chance of having lesions than another tissue region among the partitioned plurality of regions as a lesion detection candidate region; and
detecting a lesion from the determined lesion detection candidate region, the detection being performed on the determined lesion detection candidate region.
1. An apparatus for detecting a lesion, the apparatus comprising:
a memory configured to store instructions; and
at least one processor, that upon executing the stored instructions, is configured to:
partition an image into a plurality of regions by performing an image segmentation,
compare feature information of the partitioned plurality of regions to feature information of at least one tissue region,
determine at least one region corresponding to a pre-specified type of at least one tissue region having a higher chance of having lesions than another tissue region among the partitioned plurality of regions as a lesion detection candidate region, and
detect a lesion from the determined lesion detection candidate region,
wherein the detection is performed on the determined lesion detection candidate region.
18. A computer program product comprising a computer readable storage medium having a computer readable program stored therein, wherein the computer readable program, when executed on a computing device, causes the computing device to:
partition an image into a plurality of regions by performing an image segmentation;
compare feature information of the partitioned plurality of regions to feature information of at least one tissue region;
determine at least one region corresponding to a pre-specified type of at least one tissue region having a higher chance of having lesions than another tissue region among the partitioned plurality of regions as a lesion detection candidate region; and
detect a lesion from the determined lesion detection candidate region,
wherein the detection is performed on the determined lesion detection candidate region.
2. The apparatus of
3. The apparatus of
4. The apparatus of
partition the image of the breast into a plurality of regions by performing an image segmentation on the image of the breast, and
compare feature information of the partitioned plurality of regions of the image of the breast with feature information of the mammary glandular tissue region to extract the mammary glandular tissue region from the image of the breast.
5. The apparatus of
partition the image of the breast into a plurality of regions by performing an image segmentation on the image of the breast, and
compare feature information of the partitioned plurality of regions with feature information of a plurality of breast tissue regions.
6. The apparatus of
extract, from the image of the breast, a subcutaneous fat tissue region and a pectoralis muscle region, and
determine, from the image of the breast, a region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region.
7. The apparatus of
8. The apparatus of
feature information of the subcutaneous fat tissue region indicates an upper location of the image, a darker brightness compared to other regions, and a round shape;
feature information of the pectoralis muscle region indicates a lower location of the image and a band-like texture having a uniform direction; and
feature information of the mammary glandular tissue region indicates a location between the subcutaneous fat tissue region and the pectoralis muscle region, and a small spot-like texture.
9. The apparatus of
11. The method of
12. The method of
partitioning the image of the breast into a plurality of regions by performing an image segmentation on the image of the breast; and
comparing feature information of the partitioned plurality of regions of the image of the breast with feature information of the mammary glandular tissue region.
13. The method of
performing an image segmentation on the image of the breast to partition the image of the breast into a plurality of regions; and
comparing feature information of the partitioned plurality of regions with feature information of a plurality of breast tissue regions.
14. The method of
extracting a subcutaneous fat tissue region and a pectoralis muscle region from the image of the breast; and
determining, from the image of the breast, a region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region.
15. The apparatus of
at least one image acquisition sensor configured to photograph an inside of an organism to acquire the image,
wherein the at least one processor is further configured to determine a name of disease based on the lesion detected by the at least one processor.
16. The apparatus of
17. The apparatus of
This application claims the benefit under 35 U.S.C. §119(a) of Korean Patent Application No. 10-2011-0073398, filed on Jul. 25, 2011, the entire disclosure of which is incorporated by reference for all purposes.
1. Field
The following description relates to an apparatus and a method for detecting lesions and a lesion diagnosis apparatus.
2. Description of the Related Art
With the advancement of surgical techniques, different kinds of minimally invasive surgeries have been developed. A minimally invasive surgery is a surgical method in which a medical operation is performed by approaching a lesion with surgical instruments, such as a syringe or a catheter, without incising skin and muscle tissue. The medical operation may include a medicine injection, removal of a lesion, an appliance insertion, and the like. In order to perform a minimally invasive surgery, doctors need to locate the lesion. Also, in order to diagnose a disease, doctors may need to determine the size, shape, and location of the lesion.
Various types of medical imaging equipment have been developed that can aid in the detection of the size, shape, and location of a lesion. Such equipment includes a Computed Tomography (CT) system, a Magnetic Resonance Imaging (MRI) system, a Positron Emission Tomography (PET) system, a Single Photon Emission Computed Tomography (SPECT) system, and the like.
However, it may be difficult to precisely extract a lesion since the images produced by this medical imaging equipment are typically of poor quality. Accordingly, a need exists for a technology capable of precisely extracting a lesion.
According to an aspect, an apparatus for detecting a lesion is provided. The apparatus includes an extracting unit configured to extract at least one tissue region from an image of tissue regions, a setting unit configured to set at least one of the at least one extracted tissue region as a lesion detection candidate region, and a detecting unit configured to detect a lesion from the lesion detection candidate region.
The extracting unit may extract the at least one tissue region from the image by comparing feature information of the image to feature information of the tissue regions.
The extracting unit may partition the image into a plurality of regions by performing an image segmentation on the image and compare feature information of the partitioned regions with feature information of the tissue regions to extract the at least one tissue region from the image.
Each of the feature information of the image and the feature information of the tissue regions may be represented by brightness, color, texture, relative location and shape, or any combination thereof.
The image may correspond to an image of a breast, and the lesion detection candidate region corresponds to a mammary glandular tissue region.
The extracting unit may compare feature information of the image of the breast with feature information of the mammary glandular tissue region to extract the mammary glandular tissue region from the image of the breast.
The extracting unit may extract breast tissue regions from the image of the breast by partitioning the image of the breast into a plurality of regions by performing an image segmentation on the image of the breast and comparing feature information of the partitioned region with feature information of the tissue regions.
The extracting unit may extract a subcutaneous fat tissue region and a pectoralis muscle region and extract a region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region.
Breast tissue regions of the image of the breast may include a subcutaneous fat tissue region, the mammary glandular tissue region and a pectoralis muscle region.
Feature information of the subcutaneous fat tissue region may indicate an upper location of the image, a darker brightness compared to other regions, and a round shape; feature information of the pectoralis muscle region may indicate a lower location of the image and a band-like texture having a uniform direction; and feature information of the mammary glandular tissue region may indicate a location between the subcutaneous fat tissue region and the pectoralis muscle region, and a small spot-like texture.
The extracting unit may perform a morphology operation to remove noise included in an image corresponding to the extracted tissue region.
The extracting unit may include a Gabor filter, a spatial gray level dependence (SGLD) filter, or a wavelet filter.
The extracting unit may utilize a window having a size of N*N or N*M to extract feature information of the image.
In another aspect, a method for detecting a lesion is provided. The method includes extracting at least one tissue region from an image of tissue regions, setting at least one of the at least one extracted tissue region as a lesion detection candidate region, and detecting a lesion from the lesion detection candidate region.
The extracting of the at least one tissue region from the image may include comparing feature information of the image with feature information of the tissue regions.
The extracting of the at least one tissue region from the image may include performing an image segmentation on the image to partition the image into a plurality of regions, and comparing feature information of the partitioned region with feature information of the tissue regions.
The image may be an image of a breast, and the lesion detection candidate region may be a mammary glandular tissue region.
The extracting of the at least one tissue region from the image may include comparing feature information of the image of the breast with feature information of the mammary glandular tissue region to extract only the mammary glandular tissue region from the image of the breast.
The extracting of the at least one tissue region from the image may include extracting breast tissue regions from the image of the breast, and the extracting of the breast tissue regions may include performing an image segmentation on the image of the breast to partition the image of the breast into a plurality of regions, and comparing feature information of the partitioned regions with feature information of the mammary glandular tissue region.
The extracting of the at least one tissue region from the image may include extracting a subcutaneous fat tissue region and a pectoralis muscle region from the image of the breast, and extracting a region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region.
In another aspect, an apparatus for diagnosing a lesion is provided. The apparatus includes an image acquisition unit configured to photograph an inside of an organism to acquire an image, a lesion detecting unit including an extracting unit configured to extract at least one tissue region from the image, a setting unit configured to set at least one of the at least one extracted tissue region as a lesion detection candidate region and a detecting unit configured to detect a lesion from the lesion detection candidate region, and a lesion diagnosis unit configured to diagnose a name of a disease based on the lesion detected by the lesion detecting unit.
The image may be an image of a breast, and the lesion detection candidate region is a mammary glandular tissue region.
In another aspect, a medical imaging device for detecting a lesion is provided. The device may include an extracting unit configured to set at least one tissue region, as a lesion detection candidate region, from an image of a plurality of tissue regions, and a detecting unit configured to detect a lesion from the lesion detection candidate region.
Other features and aspects may be apparent from the following detailed description, the drawings, and the claims.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the systems, apparatuses and/or methods described herein will be suggested to those of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
Hereafter, examples will be described with reference to accompanying drawings.
Referring to the accompanying drawings, a lesion detecting apparatus 100 may include an extracting unit 110, a setting unit 120, and a detecting unit 130.
The extracting unit 110 may extract at least one tissue region from an image. The image may capture the inside of an organism and the image may include a plurality of tissue regions.
For example, the extracting unit 110 may use a window having a size of N*N or N*M to extract feature information of the image. The extracting unit 110 may compare feature information of the image with feature information of the tissue regions to extract the tissue region from the image. As an example, the extracting unit 110 may use machine learning to extract the tissue region from the image. The machine learning may include a Support Vector Machine (SVM), a Random Forest (RF) classification, etc. The feature information of the image and the feature information of the tissue region may represent brightness, color, texture, relative location, shape, or any combination thereof. The extracting unit 110 may use a Gabor filter, a spatial gray level dependence (SGLD) filter, or a wavelet filter to extract texture feature information.
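As a rough illustration of this window-based approach only, and not the patented implementation, the following Python sketch computes a brightness, texture, and relative-location feature for each N*N window and classifies labeled windows with a Random Forest; the window size, the Gabor frequency, and all function names are assumptions made for the example.

import numpy as np
from skimage.filters import gabor
from sklearn.ensemble import RandomForestClassifier

def window_features(image, n=16):
    # Slide an n*n window over a grayscale image and compute simple
    # feature information: mean brightness, mean Gabor texture response,
    # and the relative vertical location of the window.
    real, imag = gabor(image, frequency=0.2)
    texture = np.hypot(real, imag)
    feats = []
    for y in range(0, image.shape[0] - n + 1, n):
        for x in range(0, image.shape[1] - n + 1, n):
            feats.append([image[y:y + n, x:x + n].mean(),
                          texture[y:y + n, x:x + n].mean(),
                          y / image.shape[0]])
    return np.array(feats)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
# With labeled training windows (e.g., 0 = fat, 1 = gland, 2 = muscle):
# clf.fit(X_train, y_train)
# tissue_labels = clf.predict(window_features(new_image))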
As another example, the extracting unit 110 may perform image segmentation on the image to partition the image into a plurality of regions. The extracting unit 110 may extract feature information for each partitioned region. The extracting unit 110 may compare the feature information of the partitioned region with feature information of the tissue region to determine the tissue region of the image.
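A minimal sketch of this segmentation-based variant, assuming scikit-image 0.19 or later, might partition a grayscale image into superpixels and compute per-region feature information (brightness, a rough texture proxy, and relative vertical location) that can then be compared with stored tissue feature information; the segment count and the feature choices are illustrative assumptions, not values from this description.

import numpy as np
from skimage.segmentation import slic

def region_features(image, n_segments=50):
    # channel_axis=None tells SLIC the image is grayscale (skimage >= 0.19).
    labels = slic(image, n_segments=n_segments, channel_axis=None)
    feats = {}
    for lab in np.unique(labels):
        mask = labels == lab
        ys, _ = np.nonzero(mask)
        feats[lab] = (image[mask].mean(),           # brightness
                      image[mask].var(),            # rough texture proxy
                      ys.mean() / image.shape[0])   # relative vertical location
    return labels, feats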
Hereinafter, the description will be further made assuming that the image is an image of a breast and that a user or a manufacturer determines the mammary glandular tissue region as a lesion detection object region.
As another example, the extracting unit 110 may compare feature information of the image of the breast with feature information of the mammary glandular tissue region. Subsequently, the extracting unit 110 may extract only the mammary glandular tissue region from the image of the breast.
As yet another example, the extracting unit 110 may compare feature information of the image of the breast with feature information of the breast tissue regions. Subsequently, the extracting unit 110 may extract a breast tissue region from the image of the breast. The breast tissue regions may include a subcutaneous fat tissue region, the mammary glandular tissue region, and a pectoralis muscle region. The subcutaneous fat tissue region may have a darker brightness than the brightness of other tissue regions and have a round shape. The pectoralis muscle region may have a band-like texture with a uniform direction. The mammary glandular tissue region may have a spot-like texture and exist between the subcutaneous fat tissue region and the pectoralis muscle region. As described above, the extracting unit 110 may extract only the mammary glandular tissue region or extract all of the breast tissue regions from the image of the breast.
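The heuristic feature information above could be encoded, for example, as a simple rule that labels each extracted region from its mean brightness and relative vertical position; the thresholds below are illustrative guesses rather than values taken from this description, and the band-like and spot-like texture cues are only noted in comments.

def label_breast_region(mean_brightness, rel_row, dark_level=0.3):
    # rel_row is 0 at the top of the image and 1 at the bottom.
    if rel_row < 0.33 and mean_brightness < dark_level:
        return "subcutaneous fat"    # upper location, darker, roundish
    if rel_row > 0.66:
        return "pectoralis muscle"   # lower location, band-like texture
    return "mammary gland"           # in between, small spot-like texture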
As yet another example, the extracting unit 110 may perform an image segmentation on the image of the breast to partition the image of the breast into a plurality of regions. The extracting unit 110 may extract feature information from each partitioned region. The extracting unit 110 may compare feature information of the partitioned region with feature information of the mammary glandular tissue region to extract a region. As described above, the extracting unit 110 may extract only the mammary glandular tissue region or extract all of the breast tissue regions from the image.
As yet another example, the extracting unit 110 may extract the subcutaneous fat tissue region and the pectoralis muscle region from the image of the breast. The extracting unit 110 may designate as the mammary glandular tissue region a region between the subcutaneous fat tissue region and the pectoralis muscle region.
The extracting unit 110 may perform a morphology operation to remove noise included in an image related to the extracted tissue region.
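For instance, a morphological opening could serve as such a noise-removing morphology operation, and the region between the cleaned subcutaneous fat and pectoralis muscle masks could then be marked as the mammary glandular tissue region. The sketch below is a minimal illustration under those assumptions, with an arbitrary 5*5 structuring element and a hypothetical function name.

import numpy as np
from scipy.ndimage import binary_opening

def gland_between(fat_mask, muscle_mask):
    # Remove small noisy blobs from the extracted tissue masks.
    fat = binary_opening(fat_mask, structure=np.ones((5, 5)))
    muscle = binary_opening(muscle_mask, structure=np.ones((5, 5)))
    gland = np.zeros_like(fat, dtype=bool)
    # Per column, everything below the fat and above the muscle is gland.
    for col in range(fat.shape[1]):
        fat_rows = np.nonzero(fat[:, col])[0]
        muscle_rows = np.nonzero(muscle[:, col])[0]
        if fat_rows.size and muscle_rows.size:
            gland[fat_rows.max() + 1:muscle_rows.min(), col] = True
    return gland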
The setting unit 120 may set at least one of the at least one extracted tissue region as a lesion detection candidate region. For example, a user may specify a region having a higher chance of having a lesion than other regions as the lesion detection candidate region. As another example, in response to a user or a manufacturer specifying a cerebellum region and a muscle region as the lesion detection candidate region, the setting unit 120 may set the cerebellum region and the muscle region as the lesion detection candidate region. The cerebellum region and the muscle region may be regions among the extracted tissue regions. In response to the image being an image of a breast, a user or a manufacturer may specify the mammary glandular tissue region as the lesion detection candidate region. The mammary glandular tissue region may be a region in which lesions more frequently occur than in other tissue regions. In this example, the setting unit 120 sets the mammary glandular tissue region among the extracted tissue regions as the lesion detection candidate region.
The detecting unit 130 may detect a lesion from the lesion detection candidate region. For example, the detecting unit 130 may detect information about the location, size, and shape of the lesion. The detecting unit 130 may use an image segmentation scheme to detect a lesion. The image segmentation scheme may include a binary histogram thresholding (BHT) method, a super-pixel scheme, or the like.
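As an illustration of restricting detection to the candidate region, the sketch below substitutes Otsu thresholding for the segmentation scheme (the BHT and super-pixel schemes mentioned above are not reproduced here) and reports the location, size, and shape of each connected component; the assumption that lesions appear darker than surrounding tissue is the example's, not this description's.

import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def detect_lesions(image, candidate_mask):
    # Threshold only the pixels inside the lesion detection candidate region.
    t = threshold_otsu(image[candidate_mask])
    lesion_mask = (image < t) & candidate_mask   # assume darker (hypoechoic) lesions
    lesions = []
    for region in regionprops(label(lesion_mask)):
        lesions.append({"location": region.centroid,      # (row, col)
                        "size": region.area,
                        "shape": region.eccentricity})
    return lesions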
As described above, the lesion detecting apparatus may extract tissue regions from an image, set at least one of the extracted tissue regions as a lesion detection candidate region and perform detection only on the lesion detection candidate region. Accordingly, the chance of detecting a lesion may be increased and the time taken to detect a lesion may be reduced.
Referring to the accompanying drawings, a lesion diagnosis apparatus 200 may include an image acquisition unit 210, a lesion detecting unit 220, and a lesion diagnosis unit 230.
The image acquisition unit 210 may capture the inside of an organism as an image. The image may include a plurality of tissue regions. For example, the image acquisition unit 210 may be an ultrasonic imaging system, a Computed Tomography (CT) system, a Magnetic Resonance Imaging (MRI) system, or any other system capable of photographing the inside of an organism.
The lesion detecting unit 220 may detect a lesion from the image generated by the image acquisition unit 210. For example, the lesion detecting unit 220 may extract information about a lesion from the image. The extracted information may include the location, size, and shape of the lesion. The lesion detecting unit 220 may be implemented with a structure similar to that of the lesion detecting apparatus 100 described above.
The lesion diagnosis unit 230 may diagnose the name of a disease based on information from the lesion detected by the lesion detecting unit 220. For example, the lesion diagnosis unit 230 may diagnose the name of the disease based on information of the lesion, such as the location, size, and shape of the lesion.
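One way such a diagnosis unit could be realized, purely as an illustrative sketch and not as medical guidance or the patented design, is a classifier trained on previously diagnosed lesions described by their location, size, and shape; the variable names and the choice of a support vector machine are assumptions made for the example.

from sklearn.svm import SVC

diagnosis_clf = SVC()
# X: one row of [row, col, size, shape] per previously diagnosed lesion
# y: the corresponding disease names
# diagnosis_clf.fit(X, y)
# names = diagnosis_clf.predict(new_lesion_features)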
As described above, the lesion diagnosis apparatus 200 may segment an image into a plurality of tissue regions, extract tissue regions from the image, set at least one of the extracted tissue regions as a lesion detection candidate region, and perform detection only on the lesion detection candidate region. Accordingly, the lesion may be automatically detected and the name of the disease may be diagnosed rapidly.
Hereinafter, a process of detecting a lesion from an image of the inner part of a breast produced by the lesion detecting apparatus will be described. However, the disclosure is not limited thereto, and the lesion detecting apparatus may detect lesions in other parts of a body in addition to the breast.
Referring to the accompanying drawings, an image of a breast is acquired, breast tissue regions are extracted from the image of the breast, and one of the extracted tissue regions is set as a lesion detection candidate region.
Hereinafter, the description will be made on the assumption that the mammary glandular tissue region 320 is set as the lesion detection candidate region.
For example, the lesion detecting apparatus 100 may compare feature information of the image of the breast with feature information of the tissue regions of the breast to extract tissue regions. The extracted tissue regions may include the subcutaneous fat tissue region 311, the retromammary fat region 312, the mammary glandular tissue region 320 and the pectoralis muscle region 330. The lesion detecting apparatus 100 may perform a morphology operation to reduce noise in an image corresponding to the extracted tissue region. The lesion detecting apparatus 100 may set the mammary glandular tissue region 320 as the lesion detection candidate region. The mammary glandular tissue region 320 may be one of the tissue regions among the extracted tissue regions 311, 312, 320 and 330.
For example, the lesion detecting apparatus 100 may compare feature information of the image of the breast with feature information of the mammary glandular tissue region 320 to extract the mammary glandular tissue region 320 from the image of the breast. Subsequently, the lesion detecting apparatus 100 may set the mammary glandular tissue region 320 as the lesion detection candidate region.
Referring to the accompanying drawings, the lesion detecting apparatus 100 may detect a lesion from the mammary glandular tissue region 320 that is set as the lesion detection candidate region.
As described above, the lesion detecting apparatus detects a lesion from the lesion detection candidate region, which has a higher chance of having lesions than other tissue regions. Accordingly, there is little need to perform lesion detection on the other tissue regions having a lower chance of having lesions, so the time used to detect lesions is reduced and the lesions are more precisely detected.
Hereinafter, a process of detecting a lesion from an image of the inner part of a breast taken by the image acquisition unit will be described. However, the disclosure is not limited thereto, and the lesion detecting apparatus may detect lesions of other parts of a body in addition to the breast.
Referring to the accompanying drawings, the lesion detecting apparatus may acquire an image of a breast, partition the image of the breast into a plurality of regions by performing an image segmentation, extract the breast tissue regions by comparing feature information of each partitioned region with feature information of the breast tissue regions, set the mammary glandular tissue region as the lesion detection candidate region, and detect a lesion from the lesion detection candidate region.
As described above, the lesion detecting apparatus may partition the image of the breast into a plurality of regions and extract tissue regions of the image of the breast based on the feature information of each partitioned region, thereby improving the precision of extracting the tissue regions.
In addition, the lesion detecting apparatus may not need to perform lesion detection on tissue regions having a lower chance of having lesions, so that the time taken to detect lesions is reduced and the lesions are more precisely detected.
Hereinafter, a process of detecting a lesion from an image of the inner part of a breast taken by the image acquisition unit will be described. However, the disclosure is not limited thereto, and the lesion detecting apparatus may detect lesions of other parts of a body in addition to the breast.
Referring to the accompanying drawings, the lesion detecting apparatus may extract the subcutaneous fat tissue region and the pectoralis muscle region from an image of a breast, determine the region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region, set the mammary glandular tissue region as the lesion detection candidate region, and detect a lesion from the lesion detection candidate region.
As described above, the lesion detecting apparatus may detect a lesion from the lesion detection candidate region having a higher chance of having lesions than other tissue regions, so there is no need for the lesion detecting apparatus to perform lesion detection on the other tissue regions having a lower chance of having lesions. Accordingly, the time used to detect lesions may be reduced and the lesions may be more precisely detected.
Referring to the accompanying drawings, the lesion detecting apparatus may extract at least one tissue region from an image that includes a plurality of tissue regions.
For example, the lesion detecting apparatus extracts tissue regions from an image by comparing feature information of the image with feature information of the tissue region.
The lesion detecting apparatus may partition an image into a plurality of regions by performing an image segmentation on the image and compare feature information of each partitioned region with feature information of the plurality of tissue regions, thereby extracting the tissue regions.
In response to the image being of a breast, the lesion detecting apparatus may extract at least one breast tissue region from the image of the breast. For example, the lesion detecting apparatus may compare feature information of the image of the breast with feature information of the breast tissue regions to extract at least one breast tissue region from the image of the breast.
For example, the lesion detecting apparatus may perform an image segmentation on the image of the breast to partition the image of the breast into a plurality of regions and compare feature information of the partitioned plurality of regions with feature information of the mammary glandular tissue region to extract the mammary glandular tissue region.
For example, the lesion detecting apparatus may extract the subcutaneous fat tissue region and the pectoralis muscle region from the image of the breast and extract a region between the subcutaneous fat tissue region and the pectoralis muscle region as the mammary glandular tissue region.
The lesion detecting apparatus may set at least one of the extracted tissue regions as a lesion detection candidate region (610). For example, in response to the image being an image of breasts, the lesion detecting apparatus may set the mammary glandular tissue region among the breast tissue regions as the lesion detection candidate region.
The lesion detecting apparatus may detect lesions from the lesion detection candidate region (620).
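Tying these steps together, a hypothetical end-to-end flow might look as follows; region_features, label_breast_region, and detect_lesions are the illustrative helpers sketched earlier in this document, not components defined in this description, and the example assumes they are available in the same Python session.

import numpy as np

def detect_lesion_in_breast_image(image):
    # Extract tissue regions from the image.
    labels, feats = region_features(image)
    # Set the mammary glandular tissue region as the candidate region (610).
    candidate = np.zeros(image.shape, dtype=bool)
    for lab, (brightness, _, rel_row) in feats.items():
        if label_breast_region(brightness, rel_row) == "mammary gland":
            candidate |= labels == lab
    # Detect lesions only in the candidate region (620).
    return detect_lesions(image, candidate)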
According to this example of the method for detecting a lesion, at least one tissue region is extracted from the image, some of the extracted tissue regions may be set as a lesion detection candidate region, and the lesion detection is performed only on the lesion detection candidate region. Accordingly, the chance of detecting a lesion is improved and the time used to detect the lesion is reduced.
Program instructions to perform a method described herein, or one or more operations thereof, may be recorded, stored, or fixed in one or more computer-readable storage media. The program instructions may be implemented by a computer. For example, the computer may cause a processor to execute the program instructions. The media may include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like. Examples of program instructions include machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter. The program instructions, that is, software, may be distributed over network coupled computer systems so that the software is stored and executed in a distributed fashion. For example, the software and data may be stored by one or more computer readable recording mediums. Also, functional programs, codes, and code segments for accomplishing the example embodiments disclosed herein can be easily construed by programmers skilled in the art to which the embodiments pertain based on and using the flow diagrams and block diagrams of the figures and their corresponding descriptions as provided herein. Also, the described unit to perform an operation or a method may be hardware, software, or some combination of hardware and software. For example, the unit may be a software package running on a computer or the computer on which that software is running.
A number of examples have been described above. Nevertheless, it will be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.