In an image processing device, such as an electronic camera, having a detecting section that detects blurring of an image (movement of the image) picked up by an imaging element, part of the evaluation or calculation required for controls (AF, AE, AWB, etc.) is performed in parallel with the detection of the image blurring in order to speed up those controls. After the detection of the image blurring is completed, the influence of the blurring is corrected to perform the final calculation. In this case, the image area used as the calculation object can be limited to improve processing efficiency.
1. An image processing device comprising:
a detection unit for detecting blurring of an image picked up with an imaging element;
a simplified calculation unit for calculating an intermediate value of evaluation data of the picked-up image in parallel with a detecting operation of the detection unit;
a memory for memorizing a calculation result of the simplified calculation unit; and
a final calculation unit for calculating a final value of the evaluation data from the intermediate value of the evaluation data memorized in the memory based on the blurring detected with the detection unit after completion of the detection with the detection unit.
9. An image processing device comprising:
an input section for inputting imaging data obtained by imaging;
a simplified evaluation data converting section for converting the imaging data input with the input section into simplified evaluation data;
a storage section for storing the simplified evaluation data converted with the simplified evaluation data converting section;
a movement detecting section for detecting movement of an image from the imaging data input with the input section at a previous imaging time and the imaging data input with the input section at a present imaging time; and
a final evaluation data converting section for converting the simplified evaluation data stored in the storage section into final evaluation data in accordance with the image movement detected with the movement detecting section.
18. An electronic camera comprising:
an imaging section for picking up an image of a subject to acquire imaging data;
a simplified evaluation data converting section for converting the imaging data acquired with the imaging section into simplified evaluation data;
a storage section for storing the simplified evaluation data converted with the simplified evaluation data converting section;
a camera shake detecting section for detecting camera shake of the electronic camera from the imaging data acquired with the imaging section at a previous imaging time and the imaging data acquired by the imaging section at a present imaging time; and
a final evaluation data converting section for converting the simplified evaluation data stored in the storage section into final evaluation data in accordance with the camera shake of the electronic camera detected with the camera shake detecting section.
2. The image processing device according to
3. The image processing device according to
4. The image processing device according to
5. The image processing device according to
6. The image processing device according to
7. The image processing device according to
8. The image processing device according to
10. The image processing device according to
a reading control section for controlling reading of the simplified evaluation data stored in the storage section based on the image movement detected with the movement detecting section.
11. The image processing device according to
12. The image processing device according to
the evaluation data of the focus detection is evaluation data obtained by cumulative addition of a part of the luminance information.
13. The image processing device according to
the evaluation data of the photometry and the evaluation data of the white balance adjustment are evaluation data obtained by cumulative addition of a part of the imaging data for each color.
14. The image processing device according to
15. The image processing device according to
16. The image processing device according to
17. The image processing device according to
This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2004-299198, filed Oct. 13, 2004, the entire contents of which are incorporated herein by reference.
1. Field of the Invention
The present invention relates to an image processing device and an electronic camera, more particularly to an image processing device and an electronic camera having a function such as vibration correction which detects a movement amount of an image.
2. Description of the Related Art
Various electronic cameras have been proposed that have a so-called electronic vibration correcting function, which corrects blurring of an image caused by hand movement or the like by using the image obtained as digital data. In electronic vibration correction, a slightly wider image is acquired beforehand in consideration of movement of the image due to hand movement. After detecting the movement amount of the image, the camera performs control such that a part of the acquired wider image is cut out (extracted) and read out. By this control, a vibration-corrected image is obtained.
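As a rough sketch of this cut-out control (the function name, the shift convention, and the margin parameter are illustrative, not taken from any cited camera), the fixed-size output frame is simply read from an offset position inside the wider captured image:

```python
def stabilized_cutout(wide_image, shift, out_h, out_w, margin):
    """Electronic vibration correction: cut a fixed-size frame out of a
    slightly wider captured image, offsetting the cut-out position by the
    detected image movement so the subject stays put between frames."""
    dy, dx = shift  # detected movement of the image (rows, cols)
    return [row[margin + dx : margin + dx + out_w]
            for row in wide_image[margin + dy : margin + dy + out_h]]
```

The margin bounds how much movement can be absorbed: any detected shift larger than the margin would push the cut-out window outside the captured data.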
Moreover, Japanese Patent Application Publication No. 06-46311, for example, proposes a camera which performs an auto focus (AF) calculation, an auto exposure (AE) calculation, or an auto white balance (AWB) calculation upon occurrence of camera shake, using the method of the above-described electronic vibration correcting function.
In general, when AF is performed using a digital image, the analog imaging signal obtained with an imaging element is first AD-converted, and then a signal of a predetermined frequency component suitable for AF detection is extracted. This extracted signal is digitally and cumulatively added up to obtain AF evaluation data, and a calculating section such as a CPU performs a predetermined calculation based on this AF evaluation data to carry out known focus detection and focus control.
Here, in Japanese Patent Application Publication No. 06-46311, a slightly wider image is acquired beforehand in the same manner as in the electronic vibration correction. When hand movement occurs, as shown in
Moreover, in the example of
Furthermore, in the AE or AWB, after changing the position of the area 101 from which the image is to be cut out in response to the movement of the image, the cumulative addition is performed for each color component.
However, in the method of Japanese Patent Application Publication No. 06-46311, no processing concerning the AF, AE, and AWB controls is performed until the movement amount of the image has been detected. The processing time of the AF, AE, or AWB control upon occurrence of hand movement is therefore delayed by the time required for movement amount detection.
The present invention has been made in view of the above-described situation, and an object thereof is to provide an image processing device capable of performing controls such as the AF, AE, and AWB controls at a higher speed even when hand movement occurs, and an electronic camera on which such an image processing device is mounted.
According to the present invention, there is provided an image processing device comprising a detecting section (detection unit, detection circuit, or the like) which detects blurring of an image (movement of the image) picked up by an imaging element, wherein evaluation data of the picked-up image is calculated (simplified calculation) to obtain an intermediate value (simplified evaluation data) in parallel with the detecting operation of the detecting section, and the result is recorded in a memory. After the detection is completed, a final value of the evaluation data is calculated (final calculation) from the intermediate value recorded in the memory, based on the blurring detected by the detecting section. Since the series of calculations is divided into two stages, before and after the detection of the image blurring, the calculation time after the detection of the blurring can be shortened.
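The two-stage split above can be sketched as follows. This is a minimal illustration, not the patented implementation: the tile-summing scheme, the function names, and the shift convention are all assumptions made for the example.

```python
import numpy as np

def simplified_calculation(frame, block=8):
    """Stage 1, run in parallel with blur detection: cumulatively add each
    block x block tile of the frame into one intermediate value, so far
    less data remains to process once the blur amount is known."""
    h, w = frame.shape
    f = frame[: h - h % block, : w - w % block]
    return f.reshape(h // block, block, w // block, block).sum(axis=(1, 3))

def final_calculation(intermediate, detected_shift, area):
    """Stage 2, run after blur detection: cut out the evaluation area,
    offset by the detected blur, and reduce it to the final value."""
    (dy, dx), (y, x, h, w) = detected_shift, area
    return float(intermediate[y + dy : y + dy + h, x + dx : x + dx + w].sum())
```

Because stage 2 operates on the small intermediate array rather than the full frame, the work remaining after blur detection is a small fraction of the total.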
Examples of the evaluation data of the image include the evaluation data used for AF, AE, and AWB controls. For example, in the final calculation, an area can be cut out from the picked-up image to correct the influence of the blurring, and luminance or color information can be calculated as the evaluation data over the cut-out area used for the above-described controls. In this case, the area to be cut out from the image is determined based on the blurring detected by the detecting section, and only this cut-out area need be regarded as the calculation object. Furthermore, the intermediate values of the evaluation data in the cut-out areas of a plurality of images can be accumulated to obtain the final value of the evaluation data. By integrating a predetermined number of data into one in this accumulation, the data amount can be reduced.
Moreover, even in the simplified calculation, the calculation can be performed with only a part of the image picked up by the imaging element as the calculation object, which likewise reduces the amount of data to be calculated. In the final calculation, an area can then be further cut out from that part of the image based on the blurring detected thereafter, reducing the data to be calculated still further.
Obviously, even in the simplified calculation, a predetermined number of data can be integrated into one to reduce the data amount.
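The "integrate a predetermined number of data into one" reduction mentioned above amounts to summing each run of n consecutive values; a minimal sketch (the function name and the trailing-remainder policy are illustrative assumptions):

```python
def integrate_every_n(values, n):
    """Integrate each run of n consecutive values into one by cumulative
    addition, reducing the amount of data to store and read back.
    Any trailing remainder shorter than n is dropped in this sketch."""
    usable = len(values) - len(values) % n
    return [sum(values[i : i + n]) for i in range(0, usable, n)]
```

Applied with n = 2, six values collapse to three sums, halving the data that must be written to memory.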
For example, the present invention can be understood as an image processing device or an electronic camera comprising: an input section which inputs imaging data obtained by image capturing; a simplified evaluation data converting section which converts the imaging data input by the input section into simplified evaluation data; a storage section which stores the simplified evaluation data converted by the simplified evaluation data converting section; a movement detecting section which detects movement of an image from the imaging data input by the input section at a previous imaging time and the imaging data input by the input section at a present imaging time; and a final evaluation data converting section which converts the simplified evaluation data stored in the storage section into final evaluation data in response to the image movement detected by the movement detecting section.
According to the present invention, the imaging data is first converted into the simplified evaluation data, and therefore controls such as the AF, AE, and AWB controls can be sped up.
According to the present invention, there can be provided an image processing device capable of executing controls such as the AF, AE, and AWB controls at a higher speed even when hand movement occurs, and an electronic camera on which such an image processing device is mounted.
These and other features, aspects, and advantages of the apparatus and methods of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Preferred embodiments of the invention are described below with reference to the accompanying drawings.
An operation of the electronic camera constituted as shown in
A luminous flux from a subject (not shown) which has entered via an optical photographing system (not shown) is formed into an image on the CCD 1 as an input member and an imaging member. The CCD 1 converts the subject luminous flux into an analog imaging signal by photoelectric conversion, and an AD converter (not shown) further converts the imaging signal into digital imaging data.
The first AF arithmetic unit 2 extracts luminance (Y) data from the input imaging data, and cumulatively adds up the extracted Y data to obtain simplified AF evaluation data. The first AE arithmetic unit 3 cumulatively adds up the input imaging data for each color component to obtain simplified AE evaluation data. The first AWB arithmetic unit 4 cumulatively adds up the input imaging data for each color component to obtain simplified AWB evaluation data. These simplified evaluation data are sent to the image bus 6 as shown in
Moreover, the preprocess circuit 5 subjects the input imaging data to preprocesses such as noise removal and signal amplification and sends the preprocessed imaging data to the image bus 6. This preprocessed imaging data is stored in the SDRAM 7.
The second AF arithmetic unit 10 cumulatively adds up the simplified AF evaluation data based on the input movement information to obtain final AF evaluation data. The second AE arithmetic unit 11 cumulatively adds up the simplified AE evaluation data based on the input movement information to obtain final AE evaluation data. The second AWB arithmetic unit 12 cumulatively adds up the simplified AWB evaluation data based on the input movement information to obtain final AWB evaluation data. The final evaluation data converted and obtained in the second AF arithmetic unit 10, the second AE arithmetic unit 11, and the second AWB arithmetic unit 12 are sent to the CPU 14 via the CPU bus 9. It is to be noted that the final evaluation data may also be written in the SDRAM.
The CPU 14 performs known focus detection processing, photometry processing, and white balance adjustment processing based on the final evaluation data input from the second AF arithmetic unit 10, the second AE arithmetic unit 11, and the second AWB arithmetic unit 12, respectively.
Moreover, as shown in
Luminance color difference data (YC data) obtained in the image processing block 13 are stored again in the SDRAM 7 via the image bus 6. Thereafter, the YC data stored in the SDRAM 7 is sent to the video encoder 15 as shown in
Next, AF, AE, AWB procedures in the present embodiment will be described in more detail.
First, the AF will be described.
When the imaging data is input into the first AF arithmetic unit 2, the first AF arithmetic unit 2 produces Y data from the input imaging data (step S1). For example, the Y data are obtained by YC separation of the imaging data. Next, a predetermined frequency component is extracted from the Y data obtained in the step S1 using a low pass filter and a high pass filter (step S2). Next, the Y data are cumulatively added up to a predetermined data size to produce the simplified AF evaluation data, and the produced simplified AF evaluation data is stored in the SDRAM 7 (step S3). These processes are performed in parallel with the detection of the movement of the image.
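Steps S1 through S3 above can be sketched as follows. The luminance weights (BT.601) and the horizontal-difference filter are illustrative assumptions: the text only states that Y data are obtained by YC separation and that a predetermined frequency component is extracted with a low pass and a high pass filter.

```python
import numpy as np

def simplified_af_data(rgb, block=4):
    # Step S1: produce Y data from the imaging data (BT.601 weights are
    # an assumed, illustrative choice of YC separation coefficients).
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Step S2: extract a contrast-bearing frequency component; a simple
    # horizontal difference stands in for the LPF/HPF pair described.
    comp = np.abs(np.diff(y, axis=1))
    # Step S3: cumulatively add down to a predetermined data size,
    # yielding the simplified AF evaluation data to store in SDRAM.
    h, w = comp.shape
    comp = comp[: h - h % block, : w - w % block]
    return comp.reshape(h // block, block, w // block, block).sum(axis=(1, 3))
```

A perfectly flat image carries no contrast, so its simplified AF evaluation data is all zeros, as expected for a focus measure.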
After the completion of the detection of the movement of the image, the second AF arithmetic unit 10 reads the simplified AF evaluation data from the SDRAM 7, and cumulatively adds up the read simplified AF evaluation data to obtain the final AF evaluation data (step S4). Here, to obtain the final AF evaluation data, reading is controlled in such a manner as to cut out data of a predetermined area of the simplified AF evaluation data in accordance with the movement information of the image, and only the data in the area are cumulatively added up. It is to be noted that the above-described reading position control of the imaging data is executed by the CPU 14.
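Step S4's controlled reading can be sketched as below; the function signature and the (rows, cols) shift convention are assumptions for illustration.

```python
import numpy as np

def final_af_data(simplified, movement, origin, size):
    """Step S4: read only the cut-out area of the simplified AF
    evaluation data, shifted by the detected image movement, and
    cumulatively add it into the final AF evaluation value."""
    (dy, dx), (y0, x0), (h, w) = movement, origin, size
    window = simplified[y0 + dy : y0 + dy + h, x0 + dx : x0 + dx + w]
    return float(window.sum())
```

Because the cut-out window is positioned using the detected movement, the same subject region contributes to the AF evaluation value even when the image has shifted between frames.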
As shown in
When the imaging data is input into the first AE arithmetic unit 3, the first AE arithmetic unit 3 accumulates the input imaging data for each color component to produce the simplified AE evaluation data (step S11). This processing is performed in the first AE arithmetic unit 3 in parallel with the detection of the movement of the image.
After the completion of the detection of the movement of the image, the second AE arithmetic unit 11 reads the simplified AE evaluation data from the SDRAM 7, and cumulatively adds up the read simplified AE evaluation data for each color component to obtain the final AE evaluation data (step S12). To obtain this final AE evaluation data, the reading is controlled in such a manner as to cut out the data of the predetermined area of the simplified AE evaluation data in accordance with the image movement information, and only the data in the area are cumulatively added up. Here, the data size at the time of the calculation of the final evaluation data differs between the AE and the AWB.
According to such a method and configuration, the processing time after the detection of the movement of the image can be shortened for the AE and the AWB in the same manner as for the AF.
A cumulative adding operation performed in the present embodiment will be described in more detail. In the CCD 1, a partial area is read from the imaging data 40 shown in
Moreover, in the AF, a predetermined number of simply adjacent data may be cumulatively added up, whereas in the AE or the AWB the cumulative addition is performed for each color. For example, when the pixel arrangement of the imaging element is an RGB Bayer arrangement, the cumulative addition is performed for each of the colors R, Gr, Gb, and B.
In the second AF, the second AF arithmetic unit 10 determines the area to be cut out from the simplified evaluation data 41 based on the image movement detected with the movement detection block 8, and further accumulates the data of the determined cut-out area every predetermined number of data to obtain the final evaluation data 42. It is to be noted that in
The present invention has been described above based on the embodiment, but the present invention is not limited to the above-described embodiment, and the present invention can be variously modified or applied within the scope of the present invention.
For example, in the example of
While there has been shown and described what are considered to be preferred embodiments of the invention, it will, of course, be understood that various modifications and changes in form or detail could readily be made without departing from the spirit of the invention. It is therefore intended that the invention not be limited to the exact forms described and illustrated, but be construed to cover all modifications that may fall within the scope of the appended claims.
Tanaka, Yoshinobu, Yanada, Takashi, Ueno, Akira
Patent Citations: Japanese Patent Application Publication No. 06-46311.
Assigned to Olympus Corporation by Yoshinobu Tanaka, Akira Ueno, and Takashi Yanada (executed Sep. 12, 2005; Reel/Frame 017092/0128). Application filed Oct. 11, 2005, by Olympus Corporation. Olympus Corporation change of address recorded Apr. 1, 2016 (Reel/Frame 039344/0502).
Maintenance: payor number assigned Jul. 24, 2009; 4th-year maintenance fee paid Jul. 25, 2012; reminder mailed Oct. 7, 2016; patent expired Feb. 24, 2017, for failure to pay maintenance fees.