Methods employing chain codes or Fourier descriptors are known for calculating an outline similarity between a model image and an object image, but it is difficult with these methods to detect both an approximate similarity and a local similarity. In view of this, according to the present invention, wavelet transformation is performed on outline points of an object image, and similarity calculation is performed on a plurality of model images using a low frequency component of the transformation result. Only a model image having a close agreement in this matching processing is subjected to similarity calculation using a high frequency component of the transformation result. By virtue of this processing, similarity calculation can be performed at high speed with high precision.
8. An image processing apparatus comprising:
object extraction means for extracting an object image from image data;
outline point extraction means for extracting a predetermined number of outline points from an outline of the object image;
wavelet transformation means for performing wavelet transformation on the outline points; and
similarity calculation means for calculating a similarity between the object image and a predetermined model image based on a wavelet transformation result,
wherein said similarity calculation means comprises:
first calculation means for calculating a similarity between the object image and the model image based on a component corresponding to a low frequency component of the wavelet transformation result; and
second calculation means for calculating a similarity between the object image and the model image based on a component corresponding to a high frequency component of the wavelet transformation result, and
wherein, in said first and second calculation means, the similarity is calculated by integrating a difference in outline points between the object image and the model image.
1. An image processing method comprising:
an object extraction step of extracting an object image from image data;
an outline point extraction step of extracting a predetermined number of outline points from an outline of the object image;
a wavelet transformation step of performing wavelet transformation on the outline points; and
a similarity calculation step of calculating a similarity between the object image and a predetermined model image based on a wavelet transformation result,
wherein said similarity calculation step further comprises:
a first calculation step of calculating a similarity between the object image and the model image based on a component corresponding to a low frequency component of the wavelet transformation result; and
a second calculation step of calculating a similarity between the object image and the model image based on a component corresponding to a high frequency component of the wavelet transformation result, and
wherein, in said first and second calculation steps, the similarity is calculated by integrating a difference in outline points between the object image and the model image.
12. A computer program product comprising a computer readable medium having computer program code, for determining a similarity of images, said product comprising:
code for an object extraction step of extracting an object image from image data;
code for an outline point extraction step of extracting a predetermined number of outline points from an outline of the object image;
code for a wavelet transformation step of performing wavelet transformation on the outline points; and
code for a similarity calculation step of calculating a similarity between the object image and a predetermined model image based on a wavelet transformation result,
wherein said code for a similarity calculation step comprises:
code for a first calculation step of calculating a similarity between the object image and the model image based on a component corresponding to a low frequency component of the wavelet transformation result; and
code for a second calculation step of calculating a similarity between the object image and the model image based on a component corresponding to a high frequency component of the wavelet transformation result, and
wherein, in the first and second calculation steps, the similarity is calculated by integrating a difference in outline points between the object image and the model image.
2. The method according to
wherein in said wavelet transformation step, wavelet transformation is performed on outline points expressed by polar coordinates.
3. The method according to
4. The method according to
5. The method according to
6. The method according to
7. The method according to
a color similarity calculation step of calculating a color similarity between the object image and the model image;
a texture similarity calculation step of calculating a texture similarity between the object image and the model image; and
an integrated similarity calculation step of calculating an integrated similarity between the object image and the model image by adding a weight to the similarities which are respectively calculated in said color similarity calculation step, said texture similarity calculation step, and said similarity calculation step.
9. The apparatus according to
10. The apparatus according to
11. The apparatus according to
color similarity calculation means for calculating a color similarity between the object image and the model image;
texture similarity calculation means for calculating a texture similarity between the object image and the model image; and
integrated similarity calculation means for calculating an integrated similarity between the object image and the model image by adding a weight to the similarities which are respectively calculated by said color similarity calculation means, said texture similarity calculation means, and said similarity calculation means.
The present invention relates to an image processing method and apparatus for calculating an outline similarity between an object image and a model image.
As a conventional method of calculating an outline similarity between a silhouette image of an object of an original image and a silhouette image of a model image, methods employing chain codes or Fourier descriptors are known.
According to the outline similarity calculation method employing chain codes, outline or line segment components are followed while the direction of each component is quantized, and the quantized values are recorded as a code. For instance, when an outline is quantized in eight directions, a string of numerals from 0 to 7 is obtained as a code. Then, the difference between the obtained code of an object and that of a model object is calculated, thereby determining the similarity.
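As a concrete illustration of the chain-code scheme above, the following sketch quantizes the steps between successive outline pixels into eight direction codes. The direction numbering and the sample outline used here are illustrative assumptions, not taken from this document:

```python
# Illustrative 8-direction chain code: each step between successive
# outline pixels is quantized to one of eight neighbor directions (0-7).
# The direction numbering below is an assumption for this sketch.
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    """Quantize a closed pixel outline into a string of direction codes."""
    codes = []
    # Pair each point with its successor, wrapping around to close the outline.
    for (x0, y0), (x1, y1) in zip(points, points[1:] + points[:1]):
        codes.append(DIRS[(x1 - x0, y1 - y0)])
    return codes

# A small square traced counter-clockwise:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
```

Comparing two such code strings element by element then yields the dissimilarity described above; as the text notes, every small shape difference perturbs the codes, which is the weakness of the method.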
According to the outline similarity calculation method employing Fourier descriptors, a periodic function representing a curve of the outline is obtained, and Fourier series expansion is performed to obtain coefficients of the Fourier series, which represent characteristics of a closed curve. Then, the difference is calculated between the string of coefficients of the object and that of the model object, thereby determining the similarity.
However, according to the foregoing conventional method employing chain codes, since the similarity is determined based only on the difference of outline directions, even very small differences in outline shape are detected. Therefore, not only is a long processing time required, but it is also difficult to determine the similarity of roughly similar images.
Furthermore, according to the foregoing conventional method employing Fourier descriptors, although an approximate similarity can be calculated, it is difficult to determine the similarity of local portions, e.g., presence of corners or the like.
The present invention has been proposed to solve the conventional problems, and has as its object to provide an image processing method and apparatus capable of similarity calculation between a model image and an object image at high speed with high precision.
According to the present invention, the foregoing object is attained by providing an image processing method comprising: an object extraction step of extracting an object image from image data; an outline point extraction step of extracting a predetermined number of outline points from an outline of the object image; a wavelet transformation step of performing wavelet transformation on the outline points; and a similarity calculation step of calculating a similarity between the object image and a predetermined model image based on a wavelet transformation result.
The invention is particularly advantageous since similarity calculation between a model image and an object image can be executed at high speed with high precision.
Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
Functional Configuration
Reference numeral 23 denotes an object image storage, where objects are extracted from a sample image and silhouette images of the extracted objects (hereinafter simply referred to as object images) are stored. Reference numeral 24 denotes an outline point extraction portion, where outline point coordinates are extracted from the object image and wavelet transformation is performed.
Reference numeral 25 denotes a similarity calculation portion for calculating the similarity of an object image to each of the plurality of model images registered in the database 22. The calculated results are displayed on a display portion 26. As a display method, for instance, a plurality of model images may be displayed in order of smallest difference from the object image, i.e., in decreasing order of similarity.
Brief Description of Similarity Calculation Processing
In step S1, a silhouette image of an object is extracted from, for instance, a sample grayscale image, and the extracted image is stored in the object image storage 23 as an object image.
Next in step S2, an outline of the object image, obtained in step S1, is extracted. In step S3, the outline of the object image obtained in step S2 is equally divided by N to obtain outline points. In step S4, the obtained outline points are subjected to polar coordinate transformation with the barycenter of the object image as the center of the coordinate system. In step S5, the outline represented by the outline points is transformed into a wavelet descriptor. The above-described steps S2 to S5 are executed by the outline point extraction portion 24.
In step S6, the similarity of the outline is calculated between the object image and a model image by using a component corresponding to the low frequency component of the outline. In other words, this similarity calculation is matching processing between the object image and the model image. With regard to a model image having a close agreement with the object image in the matching processing in step S6, matching processing with higher precision is performed in step S7 by using a component corresponding to the high frequency component of the outline. The matching processing in steps S6 and S7 is executed by the similarity calculation portion 25.
In step S8, the matching result obtained in steps S6 and S7, i.e., the similarity, is displayed on the display portion 26 to inform the user.
Hereinafter, each of the processing shown in
Object Extraction Processing
The aforementioned object image extraction processing in step S1 is described with reference to
Sixteen images shown in
In the first embodiment, in each of the sixteen labeling images, connected labeled portions are detected and an independent object is extracted. Then, the pixels which are included in the extracted object but not in the labeled portions are filled in. By this processing, extraction of the silhouette image of an object (object image) is realized.
Note that since a known method is employed for the outline extraction processing in step S2, detailed description is omitted.
Outline Division Processing and Polar Coordinate Transformation Processing
Hereinafter, a preparatory processing in steps S3 and S4 for the aforementioned wavelet transformation is described.
First, the processing in step S3 for obtaining outline points of the object image is described with reference to
In step S4 in
In the first embodiment, wavelet transformation which will be described below is performed only on r(n) obtained by the polar coordinate transformation represented by Equation 1.
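The preparatory processing of steps S3 and S4 can be sketched as follows. Since Equation 1 is not reproduced in this excerpt, a common form is assumed: each of the N equally spaced outline points is expressed by its distance r(n) from the barycenter, and only r(n) is carried forward into the wavelet transformation. The function name and the sample outline are illustrative:

```python
import math

def to_polar_radii(outline_points):
    """Return r(n): the distance of each outline point from the
    barycenter (assumed form of Equation 1). The barycenter is taken
    here as the centroid of the sampled outline points."""
    n = len(outline_points)
    gx = sum(x for x, _ in outline_points) / n
    gy = sum(y for _, y in outline_points) / n
    return [math.hypot(x - gx, y - gy) for x, y in outline_points]

# Four outline points of a unit diamond centered at the origin:
diamond = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0), (0.0, -1.0)]
```

Because r(n) is measured from the barycenter, the representation is unaffected by where the object sits in the image, which is one reason the polar form is convenient for matching.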
Wavelet Transformation Processing
Hereinafter, the aforementioned wavelet transformation processing in step S5 is described with reference to
Note that the filter coefficient shown in
As described above, by transforming r(n) by the low-pass filter H0, a wave H0 is obtained, and by transforming r(n) by the high-pass filter H1, a wave H1 is obtained. In a similar manner, by further performing transformation by the low-pass filter H0 and high-pass filter H1, waves H00 and H01 are obtained. The plural levels of wavelet transformation results with respect to r(n) are stored as ri(n).
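A minimal sketch of this multilevel decomposition is given below, using Haar averaging and differencing filters as stand-ins for the low-pass filter H0 and high-pass filter H1; the actual filter coefficients are given by the table referenced above and are not reproduced here, so the numbers in this sketch are assumptions:

```python
def haar_step(signal):
    """One decomposition level: low-pass (pairwise average, H0 stand-in)
    and high-pass (pairwise half-difference, H1 stand-in), each
    downsampled by two. Haar coefficients are an assumption."""
    low = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    high = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return low, high

def wavelet_levels(r, depth):
    """Re-split the low-pass output repeatedly, yielding the waves
    H0/H1, then H00/H01, and so on, as described above."""
    levels, low = [], list(r)
    for _ in range(depth):
        low, high = haar_step(low)
        levels.append((low, high))
    return levels
```

The returned list corresponds to ri(n) stored per level: early levels keep fine (high frequency) detail, while deeper levels retain only the coarse (low frequency) shape of r(n).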
Note that similar wavelet transformation is performed in advance on each of the plurality of model images, and the transformation results are stored as rj(n) in the database 22.
Hereinafter, a similarity determination method using wavelet transformation according to the first embodiment is described in detail.
As can be seen from
Note in
Matching Processing
Hereinafter, the aforementioned matching processing in steps S6 and S7 is described with reference to FIG. 7. Steps S6 and S7 in
The first embodiment is characterized by performing matching processing while taking into consideration the influence of the start point of outline following processing. If the start point of outline following processing differs between the model image and the object image, the Sim value, a similarity index to be described later, also differs. Therefore, according to the first embodiment, the start point of outline following processing in the model image is shifted point by point, thereby obtaining a plurality of Sim values for one object image. Among the plurality of Sim values obtained, the smallest value is taken as the similarity between the model image and the object image.
In Equation 3, ri(n) and rj(n) indicate the wavelet-transformed outline point values of an object image and a model image, respectively. As mentioned above, plural levels of wavelet transformation results are obtained by performing wavelet transformation on the outline points. It is to be noted that the ri(n) and rj(n) subjected to comparison are at the same level.
Next in step S603, the Sim value obtained in step S602 is compared with the value stored in the minimum value register (not shown) in order to determine whether or not the Sim value of interest is the smallest value so far in the outline following processing. If the Sim value of interest is the smallest value, the value in the minimum value register is replaced by this Sim value in step S604. Then, in step S605, when it is determined that the outline-point shifting is completed for the entire circumference of the outline, the processing ends. In other words, the Sim value, stored ultimately in the minimum value register, is the similarity index for the model image.
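The start-point-shifted matching of steps S601 to S605 can be sketched as below. Since Equation 3 is not reproduced in this excerpt, its form is assumed to be the summed absolute difference of the transformed outline values at one level, consistent with the claims' "integrating a difference in outline points"; the function names are illustrative:

```python
def sim_value(obj_r, model_r):
    """Assumed form of Equation 3: integrate (sum) the absolute
    difference of wavelet-transformed outline values at one level.
    Smaller means more similar."""
    return sum(abs(a - b) for a, b in zip(obj_r, model_r))

def min_sim_over_shifts(obj_r, model_r):
    """Shift the model's start point over the whole circumference and
    keep the smallest Sim value, playing the role of the minimum
    value register in steps S603-S605."""
    n = len(model_r)
    return min(sim_value(obj_r, model_r[k:] + model_r[:k]) for k in range(n))
```

Rotating the model values and keeping the minimum makes the index independent of where outline following happened to start, which is exactly the problem the shifting loop addresses.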
By performing the matching processing, shown in the flowchart of
In step S6 in
If similarity determination is desired with higher precision, matching processing with higher precision is performed in step S7 in
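The two-stage matching of steps S6 and S7 can be sketched as follows. The dictionary layout, the threshold for "close agreement", and the helper name are assumptions made for illustration only:

```python
def _min_sim(a, b):
    # Smallest summed absolute difference over all start-point shifts.
    return min(sum(abs(x - y) for x, y in zip(a, b[k:] + b[:k]))
               for k in range(len(b)))

def coarse_to_fine(obj_low, obj_high, models, threshold):
    """Step S6: score every model with the low-frequency component.
    Step S7: rescore, using the high-frequency component, only the
    models whose coarse score is within the (assumed) threshold."""
    coarse = {name: _min_sim(obj_low, low)
              for name, (low, high) in models.items()}
    fine = {name: _min_sim(obj_high, models[name][1])
            for name, score in coarse.items() if score <= threshold}
    return coarse, fine
```

Because the low-frequency component has far fewer points, the coarse pass over the whole database is cheap, and the expensive high-frequency comparison runs only on the few surviving candidates. This is the source of the speed advantage claimed above.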
As has been described above, according to the first embodiment, since the similarity calculation between a model image and an object image is performed based on the wavelet descriptor representing an outline, it is possible to reduce the number of reference points, thus enabling high-speed processing.
Furthermore, since an approximate similarity or local similarity can be detected in accordance with the level of wavelet transformation, similarity calculation that meets the user's needs can be performed.
Furthermore, by virtue of the similarity calculation method of the first embodiment, high-speed image retrieval is possible. More specifically, an image desired by a user may be input as an object image, similarity calculation is then performed between the object image and the plurality of model images stored in the database, and the model image having the highest similarity, or model images whose similarity is equal to or larger than a predetermined value, may be output as the retrieval result.
Hereinafter, a second embodiment of the present invention is described.
The foregoing first embodiment has described an example of calculating the similarity with respect to an outline shape of a model image and an object image. In the second embodiment, similarities in color and texture are also taken into account.
Reference numeral 81 denotes a color similarity calculation portion including a database, which stores average color value data of the model images that are shared with the outline similarity calculation portion 80. The color similarity calculation portion 81 calculates a color similarity based on the difference between the average color value of an object image and that of a model image.
Reference numeral 82 denotes a texture similarity calculation portion including a database, which stores spatial frequency distribution data of the model images that are shared with the outline similarity calculation portion 80. The texture similarity calculation portion 82 calculates a texture similarity based on the difference between the spatial frequency distribution of an object image and that of a model image.
Reference numeral 83 denotes a weight evaluation portion, to which the aforementioned three similarities and a weight coefficient 84, set by a controller (not shown), are input. The weight coefficient 84 indicates which of the three similarities is to be emphasized. In accordance with the weight coefficient 84, the three similarities are weighted, and the result is output as an integrated similarity 86.
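Under the assumption that the weight evaluation portion forms a simple weighted combination (the linear form and the normalization below are illustrative; this excerpt does not specify them), the integration can be sketched as:

```python
def integrated_similarity(outline_sim, color_sim, texture_sim, weights):
    """Combine the three similarities into the integrated similarity 86.
    'weights' plays the role of the weight coefficient 84: a larger
    entry emphasizes the corresponding similarity. Normalizing by the
    weight total is an assumption of this sketch."""
    w_outline, w_color, w_texture = weights
    total = w_outline + w_color + w_texture
    return (w_outline * outline_sim + w_color * color_sim
            + w_texture * texture_sim) / total
```

For example, a user who cares mostly about shape would pass a weight triple that emphasizes the outline similarity, and the controller can change the triple without recomputing the three underlying similarities.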
As described above, according to the second embodiment, since similarity calculation is performed with respect to a color and texture in addition to an outline shape, the similarity can be determined with higher precision than the first embodiment.
The present invention can be applied to a system constituted by a plurality of devices (e.g., host computer, interface, reader, printer) or to an apparatus comprising a single device (e.g., copying machine, facsimile machine).
Further, the object of the present invention can also be achieved by providing a storage medium (or recording medium) recording program codes for performing the aforesaid processes to a computer system or apparatus, reading the program codes from the storage medium by a CPU or MPU of the computer system or apparatus, and then executing the program. In this case, the program codes read from the storage medium realize the functions according to the embodiments, and the storage medium storing the program codes constitutes the invention. Furthermore, besides the case where the aforesaid functions according to the above embodiments are realized by executing the program codes read by a computer, the present invention includes a case where an OS (operating system) or the like running on the computer performs a part or the whole of the processes in accordance with designations of the program codes and realizes the functions according to the above embodiments.
Furthermore, the present invention also includes a case where, after the program codes read from the storage medium are written in a function expansion card inserted into the computer or in a memory provided in a function expansion unit connected to the computer, a CPU or the like contained in the function expansion card or unit performs a part or the whole of the process in accordance with designations of the program codes and realizes the functions of the above embodiments.
In a case where the present invention is applied to the aforesaid storage medium, the storage medium stores program codes corresponding to the flowcharts (FIG. 2 and/or
As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the claims.
Fukuda, Yasuo, Osawa, Hidefumi
Assignment: Yasuo Fukuda (executed Apr 20 2000) and Hidefumi Osawa (executed Apr 21 2000) assigned their interest to Canon Kabushiki Kaisha (Reel 010786, Frame 0823); filed May 03 2000 by Canon Kabushiki Kaisha (assignment on the face of the patent).