An average image producing means 52 produces an average image from all or some of a plurality of images captured at the same location. A noise extracting means 53 extracts a noise pixel on the basis of the result of a comparison between the pixel values of the pixels in the captured images and the pixel values of the pixels at the same position in the average image. An interpolating means 54 interpolates the pixel value of the noise pixel included in the captured images using the pixel values of other pixels to produce a noise-eliminated image.
8. An image processing method comprising:
producing an average image from all or some of a plurality of images captured at the same location;
extracting a noise pixel on the basis of the result of a comparison between pixel values of pixels in the captured images and pixel values of pixels at the same position in the average image; and
interpolating the pixel value of the noise pixel included in the captured images using the pixel values of other pixels to produce a noise-eliminated image.
1. An image processing device comprising:
average image producing means for producing an average image from all or some of a plurality of images captured at the same location;
noise extracting means for extracting a noise pixel on the basis of the result of a comparison between pixel values of pixels in the captured images and pixel values of pixels at the same position in the average image; and
interpolating means for interpolating the pixel value of the noise pixel included in the captured images using the pixel values of other pixels to produce a noise-eliminated image.
9. An image processing program stored on a non-transitory medium and executed by a computer, the program comprising:
an average image producing process for producing an average image from all or some of a plurality of images captured at the same location;
a noise extracting process for extracting a noise pixel on the basis of the result of a comparison between pixel values of pixels in the captured images and pixel values of pixels at the same position in the average image; and
an interpolating process for interpolating the pixel value of the noise pixel included in the captured images using the pixel values of other pixels to produce a noise-eliminated image.
2. An image processing device according to
wherein the average image producing means produces the average image from the aligned captured images;
the noise extracting means extracts the noise pixel on the basis of the result of a comparison between the pixel values of the pixels in the aligned captured images and the pixel values of the pixels at the same position in the average image that is produced from the aligned captured images; and
the interpolating means interpolates the pixel value of the extracted noise pixel using the pixel values of pixels adjacent to the extracted noise pixel to produce a noise-eliminated image.
3. An image processing device according to
4. An image processing device according to
5. An image processing device according to
6. An image processing device according to
7. An image processing device according to
10. A non-transitory computer-readable recording medium in which the image processing program of
The present invention relates to an image processing device for processing an image obtained by photographing the form of an object such as an eye to be examined, an image processing method, an image processing program, and a recording medium storing the program.
Various methods for eliminating noise from images obtained by an imaging device such as a CCD have been proposed. For example, Patent Document 1 discloses a method in which, on the basis of the pixel value of a selected pixel and the pixel value of a pixel of interest, a new pixel value of the pixel of interest is determined, and the pixel value of the pixel of interest is replaced with the new pixel value to generate new image data, thereby eliminating noise from image data including low-frequency noise.
Patent Document 2 discloses a method in which an image acquired in a full pixel read-out mode and an image acquired substantially simultaneously in a pixel summation read-out mode are each separated into luminance components and color difference components, which are then subjected to adaptive synthesis processing.
Patent Document 3 discloses a method in which, for a pixel to be corrected, the highest value is calculated from among pixel values of a plurality of surrounding same-color pixels having the same color component as the pixel to be corrected, and among pixel values of a plurality of surrounding different-color pixels having a color component different from that of the pixel to be corrected, the plurality of surrounding different-color pixels being closer to the pixel to be corrected than are the plurality of surrounding same-color pixels, and, when the pixel value of the pixel to be corrected is higher than the calculated highest value, the pixel value of the pixel to be corrected is replaced with the calculated highest value to correct a white defect.
Patent Document 4 discloses an image processing device for eliminating noise from a moving image, the device comprising: a contrast calculation unit for calculating a contrast value for a target pixel in a basis image; a motion vector calculation unit for calculating a motion vector between the basis image and a reference image, the motion vector calculation unit using the contrast value to modify the method for calculating the motion vector; a motion compensation unit for compensating for motion of the reference image with respect to the basis image on the basis of the motion vector calculated by the motion vector calculation unit; and a weighted addition unit for performing weighted addition of the basis image and the reference image subjected to motion compensation for each target pixel.
Patent Document 5 discloses an image processing device in which an inputted image and a reference image are added to eliminate noise in the inputted image.
Patent Document 6 discloses an image processing method for eliminating noise included in an image, the method comprising extracting a high-frequency component from an image, extracting a noise component from the extracted high-frequency component using non-linear conversion, subtracting the extracted noise component from the image, again extracting the high-frequency component from the image from which the noise component was subtracted, extracting a correction component using non-linear conversion from the high-frequency component that was extracted again, and adding the extracted correction component to the image from which the noise component was subtracted.
Patent Document 1: Japanese Patent No. 3862613
Patent Document 2: Japanese Patent Laid-open Publication No. 2008-131580
Patent Document 3: Japanese Patent Laid-open Publication No. 2011-135566
Patent Document 4: Japanese Patent Laid-open Publication No. 2012-222510
Patent Document 5: Domestic Re-publication of PCT International Application No. 2010-007777
Patent Document 6: Japanese Patent No. 4535125
The prior art described above presents a problem in that, if the signal and noise characteristics are not clearly separated, the effect of noise elimination is diminished and the signal intensity may be reduced. Therefore, it is an object of the present invention to provide an image processing device for eliminating noise from an image while maintaining signal intensity, an image processing method, an image processing program, and a recording medium storing the program.
An image processing device of the present invention that solves the problems described above comprises:
average image producing means for producing an average image from all or some of a plurality of images captured at the same location;
noise extracting means for extracting a noise pixel on the basis of the result of a comparison between the pixel values of the pixels in the captured images and the pixel values of the corresponding pixels in the average image; and
interpolating means for interpolating the pixel value of the noise pixel included in the captured images using the pixel values of other pixels to produce a noise-eliminated image.
According to the present invention, a noise pixel is extracted on the basis of the difference in pixel values between a captured image and an average image. It is therefore possible to eliminate noise from an image while maintaining signal intensity.
An image processing device according to the present invention will be described in detail below on the basis of embodiments and with reference to the attached drawings. Description will be given of an example in which a tomographic image (one example of a captured image) of the fundus of an eye to be examined is acquired by an ophthalmologic examination apparatus and noise is eliminated from the tomographic image; however, the present invention can also be applied to cases in which other objects are captured using other types of apparatuses.
The illumination optical system 4 includes an observation light source such as a halogen lamp and a photographing light source such as a xenon lamp. The light from these light sources is guided to the fundus Ef via the illumination optical system 4 to illuminate the fundus. The photographic optical system 5 includes optical elements such as an objective lens and a photographic lens, and an imaging device such as a CCD. The photographic optical system 5 guides photographing light reflected by the fundus Ef along a photographing optical path to the imaging device to capture an image of the fundus Ef. The photographic optical system 5 also guides the signal light (described below) from the OCT unit 2 to the fundus Ef, and guides the light reflected therefrom back to the OCT unit 2. The scan unit 6 includes mechanisms such as galvanometer mirrors for scanning the signal light from the OCT unit 2 in the X direction and Y direction as shown in
The fundus camera unit 1 is optically connected via a connector 7 and a connecting wire 8 to the OCT unit 2 for capturing a tomographic image of the fundus Ef.
The OCT unit 2 may be not only of a Fourier domain type but also of a time domain type or a swept-source type; in the description below, however, the OCT unit 2 is assumed to be of a well-known Fourier domain type. In this case, a low coherence light source 20 emits light having a wavelength of 700-1100 nm. The light from the low coherence light source 20 is divided into reference light and signal light; the reference light advances along a reference optical path and is reflected by a reference mirror. The signal light, on the other hand, is guided to the fundus camera unit 1 via the connecting wire 8 and the connector 7, and is scanned over the fundus Ef in the X and Y directions by the scan unit 6. The signal light reflected by the fundus Ef and returned to the OCT unit 2 is superimposed on the reference light reflected by the reference mirror to produce interference light. The spectrum of the interference light is analyzed by an OCT signal detection device 21 to generate an OCT signal that carries information about the depth direction (Z direction) of the fundus.
An image processing device 3 is configured from, e.g., a microcomputer built into the fundus camera unit 1 or a personal computer connected to the fundus camera unit 1. The image processing device 3 is provided with a control unit 30 configured from a CPU, RAM, ROM, and the like. The control unit 30 controls all image processing by executing an image processing program.
A display unit 31 is configured from, e.g., a display device such as an LCD, and displays an image produced or processed by the image processing device 3 and ancillary information such as information relating to a subject.
An operation unit 32 has, e.g., a mouse, keyboard, operation panel and the like, and is used by an operator to give commands to the image processing device 3.
A tomographic image forming unit 41 is implemented by a dedicated electronic circuit for executing a well-known analysis method such as a Fourier domain method (spectral domain method) or by the image processing program executed by the CPU described above, and forms a tomographic image of the fundus Ef on the basis of the OCT signal detected by the OCT signal detection device 21. The tomographic image formed by the tomographic image forming unit 41 is stored in a memory unit 42 configured from, e.g., a semiconductor memory, hard disk device, or the like. The memory unit 42 also stores the image processing program described above.
An image processing unit 50 performs a computation process on the tomographic image (captured image) formed by the tomographic image forming unit 41 and eliminates noise included in the tomographic image. The image processing unit 50 is configured from aligning means 51 for aligning other captured images with a reference image, average image producing means 52 for producing an average image from all or some of the captured images, noise extracting means 53 for extracting a noise pixel on the basis of the result of a comparison between the pixel values of the pixels in the captured images and the pixel values of the corresponding pixels in the average image, and interpolating means 54 for interpolating the pixel value of the noise pixel with the pixel values of other pixels to produce a noise-eliminated image. Each of these means and image processes in the image processing unit 50 is implemented by a dedicated electronic circuit or by the control unit 30 executing the image processing program.
The operation of the image processing device 3 will be described next with reference to the flowchart. First, the tomographic image forming unit 41 forms N tomographic images T1-TN of the same location on the fundus Ef of an eye E to be examined (step S1).
The aligning means 51 performs a process in which each of the tomographic images Ti is aligned with a reference image (step S2). Specifically, a reference image serving as an alignment reference is first selected or produced. The reference image may be any of the tomographic images, e.g., the first tomographic image T1 or a tomographic image Ti that is displayed on the display unit 31 and selected by an operator. Alternatively, an average image of the tomographic images Ti, the tomographic image most similar to this average image, or an average image of a plurality of tomographic images selected by an operator may be used as the reference image.
Next, each of the tomographic images Ti is divided into strip regions, and each strip region is aligned with the reference image on the basis of the value calculated according to Formula 1.
In Formula 1, T(k) represents a set of pixel values (containing n pixels), and T̄ (T with a horizontal line above) represents the average of the pixel values.
Alignment can be performed by a variety of methods other than the method described above. For example, a method may be used in which alignment is performed on the entirety of the tomographic images without dividing the tomographic images into strip regions, or a method may be used in which parts of the tomographic images are extracted as characteristic regions and alignment is performed on the basis of the degree of similarity of these characteristic regions. Depending on the properties of the photographic object, the aligning process may be omitted.
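A minimal sketch of this kind of strip-based alignment is given below, assuming that each tomographic image is a two-dimensional numpy array and that normalized cross-correlation is used as the similarity measure; the exact form of Formula 1 is not reproduced here, and the function name, strip width, and search range are illustrative assumptions rather than part of the embodiment.

```python
import numpy as np

def align_strips(image, reference, strip_width=32, max_shift=20):
    """Shift each vertical strip of `image` in the depth (Z) direction so that
    its normalized cross-correlation with the corresponding strip of
    `reference` is maximized (illustrative sketch only)."""
    aligned = np.zeros_like(image)
    height, width = image.shape
    for x0 in range(0, width, strip_width):
        strip = image[:, x0:x0 + strip_width]
        ref_strip = reference[:, x0:x0 + strip_width]
        best_shift, best_score = 0, -np.inf
        for shift in range(-max_shift, max_shift + 1):
            shifted = np.roll(strip, shift, axis=0)
            a = shifted - shifted.mean()
            b = ref_strip - ref_strip.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            score = (a * b).sum() / denom if denom > 0 else -np.inf
            if score > best_score:
                best_score, best_shift = score, shift
        aligned[:, x0:x0 + strip_width] = np.roll(strip, best_shift, axis=0)
    return aligned
```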
The average image producing means 52 adds together the pixel values of the aligned tomographic images P1-PN for each of the pixels, and divides the resulting sum by the number of images N. This determines the pixel value of each of the pixels and produces an averaged image TA (step S3). The averaged image may be produced from some of the tomographic images Pi, rather than being produced from all of the aligned tomographic images Pi as described above. Alternatively, the pixel values of the averaged image may be determined using the median value or the most frequently appearing value of the pixel values of the tomographic images, rather than using the arithmetic mean of the pixel values.
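The following is a minimal sketch of step S3, assuming the aligned tomographic images P1-PN are stacked in a numpy array of shape N x height x width; the function name and the `method` parameter are illustrative, and the median option corresponds to the alternative mentioned above.

```python
import numpy as np

def produce_average_image(aligned_images, method="mean"):
    """Produce the averaged image TA from the aligned tomographic images
    P1-PN (illustrative sketch of step S3)."""
    stack = np.asarray(aligned_images, dtype=np.float64)
    if method == "mean":
        return stack.mean(axis=0)        # arithmetic mean of each pixel
    if method == "median":
        return np.median(stack, axis=0)  # per-pixel median, as an alternative
    raise ValueError(f"unsupported method: {method}")
```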
The noise extracting means 53 calculates the difference between the pixel value of each of the pixels in the aligned tomographic image Pi and the pixel value of the pixel in the averaged image TA that is located at the same position as the pixel in the aligned tomographic image Pi, and, when the absolute value of this difference is greater than a prescribed threshold, determines that the pixel is a noise pixel (step S4). The threshold may be incrementally varied in accordance with the pixel values of the averaged image TA, rather than using a single value.
The comparison between the pixel value of each of the pixels in the tomographic image Ti and the pixel value of the pixel in the averaged image TA that is located at the same position as the pixel in the tomographic image Ti may be performed using a ratio, and a pixel may be determined to be a noise pixel when the ratio deviates from a prescribed range.
In the case of tomographic images by OCT as in the present embodiment, noise pixels may include pixels having low pixel values due to the structure of the fundus, as indicated by the black circles (reference symbol 61a) in
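A minimal sketch of the extraction in step S4 is shown below, covering both the absolute-difference threshold and the ratio-based variant; it assumes numpy arrays of identical shape, and the parameter names and any concrete threshold values are illustrative assumptions.

```python
import numpy as np

def extract_noise_pixels(aligned_image, average_image,
                         diff_threshold=None, ratio_range=None):
    """Return a boolean mask marking noise pixels (sketch of step S4).
    Either an absolute-difference threshold or an allowed ratio range is
    applied; a spatially varying threshold can be passed as an array."""
    if diff_threshold is not None:
        return np.abs(aligned_image - average_image) > diff_threshold
    if ratio_range is not None:
        low, high = ratio_range
        ratio = aligned_image / np.maximum(average_image, 1e-12)
        return (ratio < low) | (ratio > high)
    raise ValueError("specify diff_threshold or ratio_range")
```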
The interpolating means 54 deletes the pixels determined in step S4 to include noise (reference symbol 62), interpolates the pixel values of the deleted pixels using the pixel values of pixels adjacent to them, and thereby produces noise-eliminated tomographic images Q1-QN, which are stored in the memory unit 42 (step S5).
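A minimal sketch of the interpolation in step S5 is shown below, assuming that each noise pixel is replaced by the mean of its non-noise eight-neighbours; the choice of neighbourhood and the handling of pixels whose neighbours are all noise pixels are illustrative assumptions.

```python
import numpy as np

def interpolate_noise_pixels(image, noise_mask):
    """Replace each noise pixel with the mean of its adjacent non-noise
    pixels (sketch of step S5); pixels with no valid neighbour are left
    unchanged."""
    result = image.astype(np.float64)
    height, width = image.shape
    for y, x in zip(*np.nonzero(noise_mask)):
        neighbours = []
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy or dx) and 0 <= ny < height and 0 <= nx < width \
                        and not noise_mask[ny, nx]:
                    neighbours.append(image[ny, nx])
        if neighbours:
            result[y, x] = np.mean(neighbours)
    return result
```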
The tomographic images Q1-QN stored in the memory unit 42 from which noise was eliminated are displayed on the display unit 31 either one at a time or with a plurality of images lined up in a single screen image by the control unit 30 on the basis of commands given by an operator via the operation unit 32.
According to the image processing device 3 of the present embodiment, noise pixels are extracted on the basis of the result of a comparison between the pixel values of each of the aligned tomographic images Pi and the averaged image TA, and an interpolating process is performed for the noise pixels. It is therefore possible to eliminate noise while maintaining signal intensity, even when the signal intensity of the tomographic images Ti is low.
Because extraction of noise pixels and interpolation of noise pixels are each performed for all of the tomographic images Ti, a plurality of (N) distinct tomographic images from which noise was eliminated can be obtained.
A second embodiment of the present invention will be described next. The configuration of the device in the second embodiment is the same as that in the first embodiment shown in
The tomographic image forming unit 41 forms N tomographic images R1-RN of the same location on the fundus Ef of an eye E to be examined, and the aligning means 51 aligns each of the tomographic images Ri. These processes are the same as in the first embodiment (steps S1 and S2). Depending on the photographed object, the aligning process can be omitted, the same as in the first embodiment. Noise pixels are included in the produced tomographic images Ri, as indicated by the black circles (reference numeral 63) in
The average image producing means 52 uses tomographic images having a difference in photographing time less than a prescribed threshold relative to the tomographic image Ri to produce a separate individual average image Bi for each tomographic image Ri (step S13). Specifically, for example, since the time required for a single scan in the X direction is constant, the individual average image Bi for the tomographic image Ri is produced, using the same method as in step S3 described above, from the tomographic images preceding and following Ri within this time difference.
When the number of tomographic images preceding or following a tomographic image Ri is less than M as with R1, R2, RN-1 and RN in
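A minimal sketch of step S13 is given below, assuming the tomographic images R1-RN are stacked in chronological order in a numpy array and that the photographing-time condition is expressed as a window of m frames before and after each image; the clipping of the window near the ends of the sequence is an illustrative assumption.

```python
import numpy as np

def produce_individual_averages(images, m):
    """Produce an individual average image Bi for each tomographic image Ri
    from the frames within +-m positions of Ri (sketch of step S13).
    Near the ends of the sequence the window is simply clipped."""
    stack = np.asarray(images, dtype=np.float64)
    n = stack.shape[0]
    averages = np.empty_like(stack)
    for i in range(n):
        low, high = max(0, i - m), min(n, i + m + 1)
        averages[i] = stack[low:high].mean(axis=0)
    return averages
```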
The noise extracting means 53 performs a subtraction process on the pixel values in each of the tomographic images Ri and the corresponding individual average images Bi, and extracts noise pixels using the same method as in step S4 described above.
The interpolating means 54 interpolates the noise pixels, produces tomographic images U1-UN from which noise was eliminated, and stores these tomographic images U1-UN in the memory unit 42. This process is the same as that used in the first embodiment (step S5).
The control unit 30 causes the noise-eliminated tomographic images U1-UN from the memory unit 42 to be displayed as video on the display unit 31 (step S6). Specifically, the tomographic images Ui are taken as single frames of video, and the tomographic images U1-UN are displayed in sequence at appropriate time intervals. Alternatively, a file in video format may be generated from the tomographic images U1-UN and played back.
In the present embodiment, the tomographic images U1-UN from which noise was eliminated are all stored in the memory unit 42 before the video is displayed. However, if the processing power of the image processing device 3 is great enough, it is possible to perform these processes in real time. Specifically, when the time required for producing a single tomographic image and performing the series of processes on this tomographic image is shorter than the time required for a single scan in the X direction, the processes in steps S2-S6 can be performed each time a new tomographic image is formed, and the noise-eliminated tomographic image can be displayed immediately.
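The following is a minimal sketch of such frame-by-frame processing, assuming a generator that yields tomographic images as they are formed and a user-supplied `process_frame` callable that performs the alignment, individual averaging, noise extraction, and interpolation described above; both names are illustrative assumptions.

```python
from collections import deque

def realtime_denoise(frame_source, process_frame, m=3):
    """Keep a sliding buffer of recent frames and emit a denoised frame as
    soon as each new frame arrives (illustrative sketch of real-time use)."""
    buffer = deque(maxlen=2 * m + 1)
    for frame in frame_source:
        buffer.append(frame)
        # process_frame is expected to align the new frame, build the
        # individual average from the buffered frames, extract noise pixels
        # and interpolate them, returning the noise-eliminated frame.
        yield process_frame(frame, list(buffer))
```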
Also in the second embodiment, the tomographic images Ui from which noise was eliminated may be displayed on the display unit 31 as still images.
According to the image processing device 3 of the second embodiment, an average image serving as the reference for the process of extracting noise pixels is produced for every individual tomographic image solely from the tomographic images having a small photographing time difference relative to the respective tomographic image. Therefore, it is possible to avoid a situation in which a change such as pulsation of a blood vessel that actually occurs in a photographed object is determined to be noise and is eliminated. Additionally, the noise-eliminated tomographic images Ui are displayed as video. This allows an operator to observe movements in blood vessels, thus helping in making a diagnosis.