A method for calibrating auto white balancing in an electronic camera includes (a) obtaining a plurality of color values from a respective plurality of images of real-life scenes captured by the electronic camera under a first illuminant, (b) invoking an assumption about a true color value of at least portions of the real-life scenes, and (c) determining, based upon the difference between the true color value and the average of the color values, a plurality of final auto white balance parameters for a respective plurality of illuminants including the first illuminant. An electronic camera device includes an image sensor for capturing real-life images of real-life scenes, instructions including a partly calibrated auto white balance parameter set and auto white balance self-training instructions, and a processor for processing the real-life images according to the self-training instructions to produce a fully calibrated auto white balance parameter set specific to the electronic camera.
14. An electronic camera device comprising:
an image sensor for capturing images of real-life scenes;
a processor; and
a non-volatile memory including
(a) a partly calibrated auto white balance parameter set consisting of a plurality of initial auto white balance parameters with only one precalibrated auto white balance parameter, associated with a reference illuminant, and
(b) machine-readable auto white balance self-training instructions that, when executed by the processor, process a subset of the images to produce a fully calibrated auto white balance parameter set specific to the electronic camera, the subset of the images being captured under a first illuminant different from the reference illuminant.
1. A method for calibrating auto white balancing in an electronic camera, comprising:
obtaining, using a processor onboard the electronic camera, a plurality of first color values from a respective first plurality of images of a respective plurality of real-life scenes captured by the electronic camera under a first illuminant, the electronic camera including a plurality of initial auto white balance parameters stored in memory onboard the electronic camera and having only one precalibrated auto white balance parameter, the precalibrated auto white balance parameter being associated with a reference illuminant different from the first illuminant;
invoking, using the processor, an assumption about a true color value of at least portions of the real-life scenes, the assumption being stored in the memory; and
determining, using a self-training module onboard the electronic camera, a plurality of final auto white balance parameters for a respective plurality of illuminants including the first illuminant and the reference illuminant, the self-training module including the processor and machine-readable self-training instructions stored in the memory that, when executed by the processor, perform the step of determining based upon the true color value, the average of the first color values, and the initial auto white balance parameters.
2. The method of
determining, based upon the difference between the true color value and the average of the first color values, the final first auto white balance parameter; and
transforming the plurality of initial auto white balance parameters, comprising an initial first auto white balance parameter for the first illuminant, to produce the plurality of final auto white balance parameters, the initial first auto white balance parameter being transformed to the final first auto white balance parameter, the step of transforming including using the processor to (a) retrieve the initial auto white balance parameters from the memory, and (b) execute machine-readable transformation instructions stored in the memory to transform the initial auto white balance parameters to the final auto white balance parameters.
3. The method of
each of the first images having color defined by a first, second, and third primary color; and
the step of transforming being performed in a two-dimensional space spanned by an ordered pair of a first color ratio and a second color ratio, the first and second color ratios together defining the relative values of the first, second, and third primary colors.
4. The method of
5. The method of
6. The method of
obtaining, from images captured by a second electronic camera, a plurality of base auto white balance parameters comprising a base auto white balance parameter for the reference illuminant;
calibrating, from images captured by the electronic camera, the base auto white balance parameter to produce the precalibrated auto white balance parameter; and
transforming the base auto white balance parameter set to produce the initial auto white balance parameter set, the initial second auto white balance parameter being the precalibrated auto white balance parameter.
7. The method of
the step of calibrating comprising capturing, by the electronic camera, a second plurality of images of one or more scenes under the reference illuminant; and
the precalibrated auto white balance parameter, when applied to white balance the second plurality of images, yielding an average color of the second plurality of images that is gray.
8. The method of
the step of calibrating comprising capturing, by the electronic camera, a second plurality of images of one or more scenes under the reference illuminant, each of the one or more scenes comprising a human face; and
the precalibrated auto white balance parameter, when applied to white balance the second plurality of images, yielding an average hue of the human faces that is a universal human facial hue.
9. The method of
10. The method of
each of the first color values being an average color of the respective image; and
the true color value being an average color of the plurality of real-life scenes, the assumption being that the average color is gray.
11. The method of
each of the first plurality of images comprising at least one human face;
each of the first color values defining an average hue of the at least one human face; and
the true color value being an average hue of human faces in the plurality of real-life scenes, the assumption being that the average hue is a universal human facial hue.
12. The method of
13. The method of
15. The device of
16. The device of
17. The device of
18. The device of
illumination identification instructions that, when executed by the processor, identify the subset of the images captured under the first illuminant; and
auto white balance parameter transformation instructions that, when executed by the processor, transform the partly calibrated auto white balance parameter set to the fully calibrated auto white balance parameter set based on analysis of the images identified using the illumination identification instructions.
19. The device of
This application claims priority to U.S. Patent Application Ser. No. 61/780,898, filed Mar. 13, 2013, the disclosure of which is incorporated herein by reference.
White balance is the process of removing unrealistic color casts from images captured by an electronic camera, such that the images provide a true color representation of a scene. For example, objects in the scene that appear white to human eyes are rendered white by white balancing the initial output of an image sensor. Human eyes are very good at judging what is white under different light sources, but image sensors often have great difficulty doing so and frequently create unsightly blue, orange, or green color casts. Different illuminants, i.e., light sources, have unique spectral characteristics. The spectral characteristics of a given illuminant may be represented by its color temperature. The color temperature of a light source is the temperature of an ideal black body radiator that radiates light of comparable hue to the light source, and it characterizes the relative warmth or coolness of white light. As the color temperature rises, the emitted light energy increases; hence, the wavelengths of light emitted by the illuminant become shorter, i.e., shift towards the blue portion of the visible spectrum, and the color hue becomes cooler.
An image sensor capturing images of a scene illuminated by a given illuminant will initially produce images with colors affected by the color temperature of the illuminant. Therefore, many electronic cameras use automatic white balance (AWB) to correct for the color output of the image sensor according to the illuminant. In order to apply AWB, the electronic camera must have AWB parameters, often represented as gains to color channels, for each illuminant. The AWB unit of an electronic camera first determines which illuminant is being used to illuminate the scene. Next, the AWB unit applies the AWB parameters of that illuminant to the image of the scene to provide an image with a more true representation of the colors of the scene.
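The two-stage AWB operation described above (identify the illuminant, then apply its stored parameters) can be sketched minimally as follows; the dictionary layout, gain values, and function name are illustrative assumptions, not values from this disclosure:

```python
# Hypothetical per-illuminant AWB gain table; values are invented for
# illustration. Each entry holds (R, G, B) channel gains.
AWB_PARAMS = {
    "D65": (1.00, 1.00, 1.00),  # reference daylight: no correction
    "A":   (0.60, 1.00, 1.80),  # incandescent: suppress red, boost blue
}

def apply_awb(pixels, illuminant):
    """White balance a list of (r, g, b) pixels using the gains stored
    for the identified illuminant."""
    gain_r, gain_g, gain_b = AWB_PARAMS[illuminant]
    return [(r * gain_r, g * gain_g, b * gain_b) for r, g, b in pixels]

# A reddish pixel captured under illuminant A is pulled toward gray:
print(apply_awb([(1.0, 0.6, 0.33)], "A"))  # approximately (0.6, 0.6, 0.59)
```

In practice the AWB unit would first identify the illuminant and only then apply the corresponding gains, as described above.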
Typically, to produce a set of AWB parameters for an electronic camera, the electronic camera captures images of a gray object, such as a specially made gray card, under illumination conditions of various color temperatures representing the range of illuminants encountered in actual use. For example, images are captured under four different reference illuminants: a D65 light source, which corresponds to noon daylight and has a color temperature of 6504 K; a cool white fluorescent (CWF) lamp with a color temperature of 4230 K; a TL84 fluorescent lamp with a color temperature of 4000 K; and light source A (incandescent tungsten) with a color temperature of 2856 K. Ideally, a manufacturer of electronic cameras with an AWB function should perform this calibration procedure for each electronic camera produced. However, such a practice is generally too expensive. A common practice in the image sensor industry is to calibrate one or a small number of electronic cameras, called the golden modules, under various illumination conditions, and then apply the resulting AWB parameter set to all other image sensors. However, sensor-by-sensor variation inherently exists due to variation in the spectral properties of, e.g., the quantum efficiency, the color filter array, and the infrared-cut filter of the image sensor. As a result, using the golden module AWB parameter set for all other image sensors frequently leads to errors.
In an embodiment, a method for calibrating auto white balancing in an electronic camera includes (a) obtaining a plurality of first color values from a respective first plurality of images, of a respective plurality of real-life scenes, captured by the electronic camera under a first illuminant, (b) invoking an assumption about a true color value of at least portions of the real-life scenes, and (c) determining, based upon the difference between the true color value and the average of the first color values, a plurality of final auto white balance parameters for a respective plurality of illuminants including the first illuminant.
In an embodiment, an electronic camera device includes (a) an image sensor for capturing real-life images of real-life scenes, (b) a non-volatile memory with machine-readable instructions, the instructions including a partly calibrated auto white balance parameter set and auto white balance self-training instructions, and (c) a processor for processing the real-life images according to the self-training instructions to produce a fully calibrated auto white balance parameter set, wherein the fully calibrated auto white balance parameter set is specific to the electronic camera.
Disclosed herein are devices and methods for calibrating the AWB parameters of an electronic camera that rely in part on automated self-training of the camera during initial use by an actual user. The automated self-training completes the AWB calibration procedure to provide a fully calibrated AWB function while relieving the manufacturer of cost-prohibitive calibration expenses. The AWB calibration procedure includes at least three main steps. First, a golden module electronic camera is used to generate a base AWB parameter set that covers illuminants having a range of color temperatures. The base AWB parameter set is applied to all the electronic cameras associated with the golden module electronic camera, for example all cameras of the same model or all cameras from the same production run. Next, the AWB parameter for a single reference illuminant, such as the D65 illuminant, is calibrated for each individual electronic camera. After this step, the camera is shipped to a user. Finally, a second AWB parameter for another illuminant is calibrated through automated self-training of the electronic camera during normal use by the user. After calibration of the second AWB parameter through automated self-training, the full set of AWB parameters is transformed according to the two calibrated AWB parameters.
As is evident from the dispersion of AWB parameters 220, 222, 224, and 226 in diagram 200, the respective illuminants D65, TL84, CWF, and A have different color compositions. For example, illuminant D65 (label 220) is shifted towards the blue end of the visible spectrum, while illuminant A (label 226) is shifted towards the red and green portions of the visible spectrum. Illuminants TL84, CWF, and A are redder and less blue than illuminant D65. This illustrates the importance of properly white balancing images captured by an electronic camera according to the illuminant illuminating the scene. For example, an image captured under illuminant A may appear to have a red color cast if the image is not white balanced. White balancing of an image captured under illuminant A is achieved by correcting the colors of the image according to the ordered pair of color ratios associated with illuminant A in diagram 200. Under the stated assumption that the ordered pair is [G/B, G/R], the blue and red color components of the image are multiplied by the respective color ratios of horizontal axis 210 and vertical axis 212. By characterizing the illuminants according to the color ratios G/B and G/R, diagram 200, or any equivalent graphical or non-graphical representation thereof, conveniently provides the color gains to be used to white balance the image. Other examples of ordered pairs, such as [R*B/G^2, B/R], provide the same color gains after a simple algebraic manipulation.
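The [G/B, G/R] representation can be sketched directly: under the assumption stated above, the ordered pair measured for a gray patch directly supplies the blue and red gains. The numeric sensor responses below are invented for illustration:

```python
def illuminant_ratios(r, g, b):
    """Ordered pair [G/B, G/R] for the sensor response (r, g, b) to a
    gray patch under a given illuminant."""
    return (g / b, g / r)

def white_balance(pixel, ratios):
    """Multiply the blue and red components by the respective color
    ratios; the green component is left unchanged."""
    ratio_gb, ratio_gr = ratios
    r, g, b = pixel
    return (r * ratio_gr, g, b * ratio_gb)

# Assumed sensor response to a gray patch under a warm (reddish) illuminant:
ratios = illuminant_ratios(1.25, 1.0, 0.8)      # -> (1.25, 0.8)
print(white_balance((1.25, 1.0, 0.8), ratios))  # the gray patch maps back to gray
```

Equivalent ordered pairs, such as [R*B/G^2, B/R], carry the same information and yield the same gains after algebraic manipulation.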
For example, processor 330 analyzes the captured images according to instructions 350 and, based thereupon, saves images deemed suitable for AWB self-training to data storage 360. When a sufficient number of images suitable for AWB self-training have been stored to data storage 360, processor 330 analyzes the stored images according to instructions 350 to determine the final AWB parameter set. Temporary values and results generated by processor 330 during this process may be stored to data storage 360 or kept in a working memory not shown in
Processor 330, instructions 350, and data storage 360 together constitute an embodiment of self-training module 120 of
Images captured by image sensor 310 and, optionally, white balanced by processor 330 may be outputted to a user through interface 380. Interface 380 may include, e.g., a display and a wired or wireless communication port. Interface 380 may further be used to receive instructions and other data from an outside source such as a user.
Data storage 460 is an embodiment of data storage 360 (
In certain embodiments, data storage 460 further includes an initial AWB parameter set 464, which is a partially calibrated AWB parameter set that is either provided with the electronic camera, e.g., electronic camera 300 (
In a step 510, a base AWB parameter set is obtained from the calibration of an associated golden module electronic camera under several illuminants. Diagram 200 of
In a step 520, the electronic camera captures images under a reference illuminant, where the reference illuminant is one of the illuminants used to produce the base AWB parameter set obtained in step 510. For example, prior to shipping electronic camera 300 (
In a step 540, the base AWB parameter set obtained in step 510 is transformed into an initial AWB parameter set, such that the initial AWB parameter for the reference illuminant is that obtained in step 530. In an embodiment, step 540 is performed by the manufacturer and the resulting initial AWB parameter set is stored to the electronic camera, e.g., electronic camera 300 (
In a step 550, images of real-life scenes are captured using the electronic camera. Step 550 is, for example, performed by a user who captures images of real-life scenes using electronic camera 300 (
In a step 560, the initial AWB parameter set generated in step 540 is further transformed according to the calibration, generated in step 555, of the AWB parameter for the first illuminant. This produces a final AWB parameter set calibrated specifically to this particular electronic camera. The final AWB parameter set includes the calibrated AWB parameters for the reference and first illuminants, generated in steps 540 and 555, respectively. Step 560 is, for example, executed by processor 330 (
Steps 550, 555, and 560 constitute the automated self-training portion of the calibration of AWB parameters for the electronic camera.
In certain embodiments, the transformations performed in steps 540 and 560 of method 500 (
In an embodiment, the initial AWB parameter set generated in step 540 is further translated to place the AWB parameter for the reference illuminant at the origin of the coordinate system in which the transformation is performed. Referring to the example of diagram 600 (
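The translate-rotate-scale transformation described above may be sketched as a similarity transform about the reference-illuminant parameter. The function below, and the [G/B, G/R] coordinates used to exercise it, are illustrative assumptions rather than the patent's actual implementation:

```python
import math

def transform_set(params, ref, base_first, calib_first):
    """Map each base AWB parameter (a 2-D color-ratio pair) by the
    rotation and scaling, about the reference parameter `ref`, that
    takes the base parameter for the first illuminant (`base_first`)
    onto its self-trained calibrated value (`calib_first`)."""
    def delta(p, q):
        return (p[0] - q[0], p[1] - q[1])

    vx, vy = delta(base_first, ref)   # base offset from the reference
    wx, wy = delta(calib_first, ref)  # calibrated offset from the reference
    scale = math.hypot(wx, wy) / math.hypot(vx, vy)
    angle = math.atan2(wy, wx) - math.atan2(vy, vx)
    cos_a, sin_a = math.cos(angle), math.sin(angle)

    out = []
    for p in params:
        dx, dy = delta(p, ref)  # translate: reference parameter -> origin
        out.append((ref[0] + scale * (cos_a * dx - sin_a * dy),
                    ref[1] + scale * (sin_a * dx + cos_a * dy)))
    return out

# The reference parameter is a fixed point; base_first lands on calib_first.
print(transform_set([(1.0, 1.0), (2.0, 1.0), (1.5, 1.0)],
                    ref=(1.0, 1.0), base_first=(2.0, 1.0),
                    calib_first=(1.0, 2.0)))
```

Anchoring the transform at the reference-illuminant parameter leaves that parameter unchanged, consistent with its having been calibrated separately before shipment.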
The full AWB calibration procedure for the electronic camera amounts to a camera-specific transformation of the base AWB parameter set. The specific calibration of an AWB parameter for a reference illuminant (step 530 of
The average color obtained in step 830 may be different from the actual color of the gray card. For example, the average color may be shifted towards red or blue. In a step 840, the AWB parameter for the reference illuminant is calibrated such that the calibrated AWB parameter, when applied to the average color determined in step 830, yields the color gray, i.e., the actual color of the gray card. In one embodiment, step 840 is performed onboard the electronic camera. For example, processor 330 of electronic camera 300 (
Method 800 describes processing of images in steps 810, 820, and 830 with all images processed by step 810, followed by all images processed by step 820, followed by all images processed by step 830. Images may instead be processed sequentially through two of steps 810, 820, and 830, or through all three steps, without departing from the scope hereof.
In a step 910, a color value is determined for each real-life image captured by the electronic camera. In an embodiment, the color value of a real-life image is the average color of the image. Step 910 is, for example, performed by processor 330 of electronic camera 300 (
Returning to
A step 940 invokes the gray world assumption discussed above. For example, processor 330 of electronic camera 300 (
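Steps 910 through 940 can be summarized in a short gray-world sketch; the per-image averages below, and the normalization with the green gain fixed at 1, are assumptions for illustration:

```python
def gray_world_gains(image_averages):
    """image_averages: per-image average (r, g, b) colors for real-life
    images identified as captured under the first illuminant. Returns
    (R, G, B) gains, normalized so the green gain is 1, that map the
    overall average color to gray (equal R, G, B), per the gray world
    assumption."""
    n = len(image_averages)
    avg_r = sum(a[0] for a in image_averages) / n
    avg_g = sum(a[1] for a in image_averages) / n
    avg_b = sum(a[2] for a in image_averages) / n
    return (avg_g / avg_r, 1.0, avg_g / avg_b)

# Real-life images under a warm illuminant average out reddish:
print(gray_world_gains([(1.2, 1.0, 0.9), (1.3, 1.0, 0.7)]))
# approximately (0.8, 1.0, 1.25): red suppressed, blue boosted
```

The difference between the averaged color values and the assumed true (gray) color value is what determines the self-trained AWB parameter for the first illuminant.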
Method 900 describes processing of images in steps 910 and 920 with all images processed by step 910, followed by all images processed by step 920. In an embodiment, the electronic camera, for example electronic camera 300 (
In an embodiment, the electronic camera, e.g., electronic camera 300 of
Method 1100 is similar to method 900 (
The first two steps of method 1100 are steps 910 and 920 of method 900 (
A step 1140 invokes the universal human facial hue assumption discussed above. For example, the universal human facial hue assumption is invoked by processor 330 of electronic camera 300 (
Method 1100 describes processing of images in steps 910, 920, and 1125 with all images processed by step 910, followed by all images processed by step 920, followed by all images processed by step 1125. In an embodiment, the electronic camera, e.g., electronic camera 300 of
In an embodiment, the electronic camera, e.g., electronic camera 300 of
In comparison to self-training based on the gray world assumption, self-training based on the universal human facial hue assumption may require a smaller number of real-life images to provide an accurate calibration of the AWB parameter for the first illuminant. The reason is that each individual human face has a hue that is very close to the universal human facial hue, while it likely requires a multitude of real-life images to achieve an average color composition that is gray. On the other hand, the electronic camera, e.g., electronic camera 300 of
The average color obtained in step 1230 may represent a different hue than the universal human facial hue. For example, the hue may be shifted towards red or blue as compared to the human facial hue. In a step 1240, the AWB parameter for the reference illuminant is calibrated such that the calibrated AWB parameter, when applied to the average color determined in step 1230, yields a color representative of the universal human facial hue. In one embodiment, step 1240 is performed onboard the electronic camera. For example, processor 330 of electronic camera 300 (
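The facial-hue calibration of step 1240 may be sketched as follows; the target facial color below is an invented placeholder, since no numeric value for the universal human facial hue is given here:

```python
# Assumed target color representing the universal human facial hue;
# this value is a placeholder for illustration only.
TARGET_FACE_RGB = (1.00, 0.80, 0.70)

def facial_hue_gains(face_averages, target=TARGET_FACE_RGB):
    """face_averages: per-image average (r, g, b) colors over detected
    face regions. Returns per-channel gains that map the overall average
    face color onto the assumed target facial color."""
    n = len(face_averages)
    avg = [sum(a[i] for a in face_averages) / n for i in range(3)]
    return tuple(t / o for t, o in zip(target, avg))

# If the observed faces already average to the target, the gains are unity:
print(facial_hue_gains([(1.10, 0.80, 0.60), (0.90, 0.80, 0.80)]))
```

Because each individual face is close to the assumed universal hue, fewer images may suffice for this calibration than for the gray-world approach.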
Method 1200 describes processing of images in steps 1210 and 1220 with all images processed by step 1210, followed by all images processed by step 1220. Images may instead be sequentially processed by steps 1210 and 1220, without departing from the scope hereof.
Combinations of Features
Features described above as well as those claimed below may be combined in various ways without departing from the scope hereof. For example, it will be appreciated that aspects of one device or method for automated self-training of auto white balance in electronic cameras described herein may incorporate or swap features of another device or method for automated self-training of auto white balance in electronic cameras described herein. The following examples illustrate possible, non-limiting combinations of embodiments described above. It should be clear that many other changes and modifications may be made to the methods and device herein without departing from the spirit and scope of this invention:
(A) A method for calibrating auto white balancing in an electronic camera may include (i) obtaining a plurality of first color values from a respective first plurality of images, of a respective plurality of real-life scenes, captured by the electronic camera under a first illuminant, (ii) invoking an assumption about a true color value of at least portions of the real-life scenes, and (iii) determining, based upon the difference between the true color value and the average of the first color values, a plurality of final auto white balance parameters.
(B) In the method denoted as (A), the plurality of final auto white balance parameters may be associated with a respective plurality of illuminants including the first illuminant.
(C) In the methods denoted as (A) and (B), the plurality of final auto white balance parameters may include a final first auto white balance parameter for the first illuminant.
(D) In the method denoted as (C), the step of determining may include determining, based upon the difference between the true color value and the average of the first color values, the final first auto white balance parameter.
(E) The methods denoted as (C) and (D) may further include transforming a plurality of initial auto white balance parameters that includes an initial first auto white balance parameter for the first illuminant, to produce the plurality of final auto white balance parameters, wherein the initial first auto white balance parameter is transformed to the final first auto white balance parameter.
(F) In the methods denoted as (A) through (E), the step of obtaining may include selecting the first plurality of images from a superset of images of real-life scenes captured by the electronic camera, wherein each image in the first plurality of images is captured under the first illuminant.
(G) In the methods denoted as (A) through (F), each of the first color values may be an average color of the respective image.
(H) In the method denoted as (G), the true color value may be an average color of the plurality of real-life scenes, where the average color is gray.
(I) In the methods denoted as (A) through (F), each of the first plurality of images may include at least one human face, and each of the first color values may define an average hue of the at least one human face.
(J) In the method denoted as (I), the true color value may be an average hue of human faces in the plurality of real-life scenes, wherein the average hue is a universal human facial hue.
(K) In the methods denoted as (I) and (J), the step of obtaining may include selecting the first plurality of images from a superset of images of real-life scenes captured by the electronic camera, wherein each image in the first plurality of images is captured under the first illuminant and includes at least one human face.
(L) In the method denoted as (K), the step of obtaining may further include applying a face detection routine to the superset of images.
(M) In the methods denoted as (E) through (L), each of the first images may have color defined by a first, second, and third primary color, and the step of transforming may be performed in a two-dimensional space spanned by an ordered pair of a first color ratio and a second color ratio, where the first and second color ratios together define the relative values of the first, second, and third primary colors.
(N) In the method denoted as (M), the step of transforming may include rotating and scaling the initial white balance parameter set within the two-dimensional space.
(O) In the methods denoted as (M) and (N), the ordered pair may be [second primary color/third primary color, second primary color/first primary color], [first primary color*third primary color/second primary color^2, third primary color/first primary color], [Log(second primary color/third primary color), Log(second primary color/first primary color)], [Log(first primary color*third primary color/second primary color^2), Log(third primary color/first primary color)], or a derivative thereof.
(P) In the methods denoted as (C) through (O), the plurality of initial auto white balance parameters may include an initial second auto white balance parameter for a second illuminant, and the method may further include determining the plurality of initial auto white balance parameters by (i) obtaining a plurality of base auto white balance parameters including a base second auto white balance parameter for the second illuminant, (ii) calibrating the base second auto white balance parameter to produce a calibrated value thereof, and (iii) transforming the base auto white balance parameter set to produce the initial auto white balance parameter set, wherein the initial second auto white balance parameter is the calibrated value.
(Q) In the method denoted as (P), the step of calibrating may include capturing, by the electronic camera, a second plurality of images of one or more scenes under the second illuminant, such that the calibrated value, when applied to white balance the second plurality of images, yields an average color of the second plurality of images that is gray.
(R) In the method denoted as (P), the step of calibrating may include capturing, by the electronic camera, a second plurality of images of one or more scenes under the second illuminant, wherein each of the one or more scenes comprises a human face, and the calibrated value, when applied to white balance the second plurality of images, yields an average hue of the human faces that is a universal human facial hue.
(S) In the methods denoted as (P) through (R), the plurality of base auto white balance parameters may be determined from images captured by a second electronic camera.
(T) An electronic camera device may include (i) an image sensor for capturing real-life images of real-life scenes, (ii) a non-volatile memory comprising machine-readable instructions, the instructions comprising a partly calibrated auto white balance parameter set and auto white balance self-training instructions, and (iii) a processor for processing the real-life images according to the self-training instructions to produce a fully calibrated auto white balance parameter set, wherein the fully calibrated auto white balance parameter set is specific to the electronic camera.
(U) In the device denoted as (T), the self-training instructions may include an assumption about the real-life scenes.
(V) In the device denoted as (U), the assumption may include an assumption that the average color of a plurality of the real-life scenes is gray.
(W) In the device denoted as (U), the assumption may include an assumption that the hue of human faces is a universal human facial hue.
(X) In the devices denoted as (T) through (W), the self-training instructions may include illumination identification instructions that, when executed by the processor, identify a subset of the real-life images captured under a first illuminant.
(Y) In the device denoted as (X), the self-training instructions may further include auto white balance parameter transformation instructions that, when executed by the processor, transform a partly calibrated auto white balance parameter set to a fully calibrated auto white balance parameter set based on analysis of the images identified using the illumination identification instructions.
(Z) In the devices denoted as (T) through (Y), the self-training instructions may further include face detection instructions that, when executed by the processor, identify human faces in real-life images.
Changes may be made in the above methods and devices without departing from the scope hereof. It should thus be noted that the matter contained in the above description and shown in the accompanying drawings should be interpreted as illustrative and not in a limiting sense. The following claims are intended to cover generic and specific features described herein, as well as all statements of the scope of the present method and device, which, as a matter of language, might be said to fall therebetween.
Assignment: filed Mar. 13, 2014, to OmniVision Technologies, Inc. (assignment on the face of the patent); executed May 8, 2014, by assignor Changmeng Liu to assignee OmniVision Technologies, Inc., as an assignment of assignor's interest (Reel 032867, Frame 0045).