An image compensating apparatus includes an input unit to receive input of a photo image, a Gaussian filtering unit to perform Gaussian filtering on the photo image, a division-map generating unit to convert the photo image on which the Gaussian filtering is performed into a color space including a plurality of color components and then generate a chromaticity division-map based on a color coordinate value in the color space, a calculating unit to calculate average chromaticity of each respective region by using the chromaticity division-map, and a compensating unit to compensate chromaticity of each respective region of the photo image by using the average chromaticity of each respective region. Accordingly, a faded image input to the input unit may be effectively compensated.
8. A method of compensating an image, the method comprising:
when a photo image is input, performing Gaussian filtering on the photo image;
converting the photo image on which the Gaussian filtering is performed into a color space comprising a plurality of color components, and then generating a chromaticity division-map for dividing the photo image on which the Gaussian filtering is performed into regions according to chromaticity based on a color coordinate value in the color space;
calculating average chromaticity of each respective region among the regions by using the chromaticity division-map; and
compensating chromaticity of each respective region of the photo image by using the average chromaticity of each respective region.
1. An image compensating apparatus comprising:
an input unit to receive input of a photo image;
a Gaussian filtering unit to perform Gaussian filtering on the photo image;
a division-map generating unit to convert the photo image on which the Gaussian filtering is performed into a color space comprising a plurality of color components and then generate a chromaticity division-map for dividing the photo image on which the Gaussian filtering is performed into regions according to chromaticity based on a color coordinate value in the color space;
a calculating unit to calculate average chromaticity of each respective region among the regions by using the chromaticity division-map; and
a compensating unit to compensate chromaticity of each respective region of the photo image by using the average chromaticity of each respective region.
15. An image processing apparatus comprising:
a display unit to display a selection window about a plurality of methods of compensating chromaticity; and
a chromaticity compensating unit to compensate chromaticity of a photo image by using a chromaticity compensating method that is selected in the selection window,
wherein the chromaticity compensating method comprises a compensating method based on an L-Cyb-Crg color space and at least one method based on estimation of a light source, and
wherein the compensating method based on the L-Cyb-Crg color space is a chromaticity compensating method comprising performing Gaussian filtering on the photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map of regions of the photo image based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region of the regions by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated average chromaticity of each respective region.
18. A method of processing an image, the method comprising:
displaying photo images compensated according to a plurality of methods of compensating chromaticity, respectively;
when one photo image among the displayed photo images is selected, compensating an original photo image by using a chromaticity compensating method corresponding to the selected photo image; and
performing at least one operation of printing, transmitting and storing the compensated photo image,
wherein the plurality of methods of compensating chromaticity comprise a compensating method based on an L-Cyb-Crg color space and at least one compensating method based on estimation of a light source, and
wherein the compensating method based on an L-Cyb-Crg color space is a chromaticity compensating method comprising performing Gaussian filtering on a photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map for dividing the photo image on which the Gaussian filtering is performed into regions according to chromaticity based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region among the regions by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated chromaticity of each respective region.
17. A method of processing an image in an image processing apparatus, the method comprising:
when a program to compensate a photo image is operated, displaying an interface window to compensate chromaticity;
when the photo image to be compensated is selected in the interface window, displaying the selected photo image, and receiving selection of a region to be compensated of the photo image;
when the region to be compensated is selected, compensating the selected region by using a basic compensating method;
displaying an image that is compensated by using the basic compensating method;
when a menu to check a result obtained by using a different compensating method is selected, displaying at least one image that is compensated by using the different compensating method;
when chromaticity compensation using the basic compensating method or the different compensating method is completed, displaying a menu to adjust saturation and contrast; and
when adjustment of the saturation and the contrast are completed, displaying an image comprising at least one executing command about a final image,
wherein any one of the basic compensating method and the different compensating method is a compensating method based on an L-Cyb-Crg color space comprising performing Gaussian filtering on the photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map of regions of the photo image based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region of the regions by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated average chromaticity of each respective region.
2. The image compensating apparatus of
wherein the division-map generating unit converts the photo image on which Gaussian filtering is performed into an L-Cyb-Crg color space, and generates the chromaticity division-map based on a Cyb-Crg color coordinate value, and
wherein the compensating unit compensates the chromaticity by using a CMY value converted by the converting unit and the average chromaticity of each respective region.
3. The image compensating apparatus of
4. The image compensating apparatus of
wherein, when the Cyb value is equal to or greater than a first predetermined threshold value, the division-map generating unit determines a corresponding pixel to be yellow,
wherein, when the Cyb value is less than the first predetermined threshold value, the division-map generating unit determines the corresponding pixel to be blue,
wherein, when the Crg value is equal to or greater than a second predetermined threshold value, the division-map generating unit determines the corresponding pixel to be red, and
wherein, when the Crg value is less than the second predetermined threshold value, the division-map generating unit determines the corresponding pixel to be green.
5. The image compensating apparatus of
a saturation and contrast adjusting unit to adjust saturation and contrast of an image of which chromaticity is compensated by the compensating unit.
6. The image compensating apparatus of
wherein ‘NPn’ is the number of pixels of a region ‘n’, ‘n’ is each region, and R(x,y,n), G(x,y,n), and B(x,y,n) are R, G and B values of each pixel of the region ‘n’, respectively.
7. The image compensating apparatus of
wherein Wi(x,y) is a weight of pixels of each channel in the CMY color space, Lavg(n) is average lightness of each respective region, and li(x,y,n) is chromaticity of each respective pixel of an input photo image.
9. The method of
converting the photo image on which the Gaussian filtering is performed into a C-M-Y color space,
wherein the generating of the chromaticity division-map comprises converting the photo image on which Gaussian filtering is performed into an L-Cyb-Crg color space, and generating the chromaticity division-map based on a Cyb-Crg color coordinate value, and
wherein the compensating comprises compensating the chromaticity by using a CMY value converted from the photo image and the average chromaticity of each respective region.
10. The method of
11. The method of
12. The method of
adjusting saturation and contrast of an image of which chromaticity is compensated.
13. The method of
wherein ‘NPn’ is the number of pixels of a region ‘n’, ‘n’ is each region, and R(x,y,n), G(x,y,n), and B(x,y,n) are R, G and B values of each pixel of the region ‘n’, respectively.
14. The method of
wherein Wi(x,y) is a weight of pixels of each channel in the CMY color space, Lavg(n) is average lightness of each respective region, and li(x,y,n) is chromaticity of each respective pixel of an input photo image.
16. The image processing apparatus of
a controller to control the display unit to display a photo image that is compensated by the chromaticity compensating unit,
wherein the photo image comprises at least one of an image captured by an imaging device installed in the image processing apparatus, an image transmitted from an external device, and an image read from a recording medium installed inside or outside the image processing apparatus.
19. The method of
This application claims priority under 35 U.S.C. §119(a) from Korean Patent Application No. 10-2011-0035909, filed on Apr. 18, 2011, in the Korean Intellectual Property Office, the contents of which are incorporated herein by reference in their entirety.
1. Field of the General Inventive Concept
The present general inventive concept relates to a method and apparatus to compensate an image, and a method and apparatus to process an image, and more particularly, to a method and apparatus to compensate a faded image, and a method and apparatus to process a faded image.
2. Description of the Related Art
Photos have been used for a long period of time as a medium for recalling the past. Recently, as digital cameras have become widespread, many users store photos in a memory or upload them to a webpage in order to view them, without printing them. However, even in such a situation, a considerable number of users print and use photos.
In general, the chromaticity of an image captured by a camera varies according to the chromaticity of the light source used in the capturing process. However, as time passes, the chromaticity of a photo changes, and thus the photo generally becomes faded. In particular, the speed or degree of fading is determined by the environment, such as temperature or humidity, the characteristics of the photographic paper, and the characteristics of the material used to print the photo.
Thus, if a user does not have the original data or film of a faded photo, the user may not obtain a photo having original chromaticity.
Thus, technologies for compensating faded photos to restore their original chromaticity have been developed. However, these technologies are used to compensate a photo whose chromaticity has changed uniformly across all parts. Thus, it is difficult to compensate a photo whose chromaticity has changed nonuniformly.
Therefore, there is a need for a technology of compensating a photo image to have chromaticity similar to original chromaticity.
The present general inventive concept provides a method and apparatus to compensate an image, and a method and apparatus to process an image, thereby effectively compensating a faded image.
Additional features and utilities of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
The foregoing and/or other features and utilities of the present general inventive concept may be achieved by providing an image compensating apparatus including: an input unit to receive input of a photo image, a Gaussian filtering unit to perform Gaussian filtering on the photo image, a division-map generating unit to convert the photo image on which the Gaussian filtering is performed into a color space including a plurality of color components and then generate a chromaticity division-map based on a color coordinate value in the color space, a calculating unit to calculate average chromaticity of each respective region by using the chromaticity division-map, and a compensating unit to compensate chromaticity of each respective region of the photo image by using the average chromaticity of each respective region.
The image compensating apparatus may further include a converting unit to convert the photo image on which the Gaussian filtering is performed into a C-M-Y color space, wherein the division-map generating unit converts the photo image on which Gaussian filtering is performed into an L-Cyb-Crg color space, and generates the chromaticity division-map based on a Cyb-Crg color coordinate value, and wherein the compensating unit compensates the chromaticity by using a CMY value converted by the converting unit and the average chromaticity of each respective region.
The compensating unit may compensate the photo image by using a weight obtained by converting a size of each channel in a Cyb-Crg space into a CMY value.
The division-map generating unit may check Cyb and Crg values of each pixel, wherein, when the Cyb value is equal to or greater than a first predetermined threshold value, the division-map generating unit determines a corresponding pixel to be yellow, wherein, when the Cyb value is less than the first predetermined threshold value, the division-map generating unit determines the corresponding pixel to be blue, wherein, when the Crg value is equal to or greater than a second predetermined threshold value, the division-map generating unit determines the corresponding pixel to be red, and wherein, when the Crg value is less than the second predetermined threshold value, the division-map generating unit determines the corresponding pixel to be green.
The image compensating apparatus may further include a saturation and contrast adjusting unit to adjust saturation and contrast of an image of which chromaticity is compensated by the compensating unit.
The calculating unit may calculate average chromaticity of each CMY value according to the following equation:
wherein ‘NPn’ is the number of pixels of a region ‘n’, ‘n’ is each region, and R(x,y,n), G(x,y,n), and B(x,y,n) are R, G and B values of each pixel of the region ‘n’, respectively.
The compensating unit may compensate chromaticity of each respective region of the photo image according to the following equation:
wherein Wi(x,y) is a weight of pixels of each channel in the CMY color space, Lavg(n) is average lightness of each respective region, and li(x,y,n) is chromaticity of each respective pixel of an input photo image.
The foregoing and/or other features and utilities of the present general inventive concept may also be achieved by providing a method of compensating an image, the method including: when a photo image is input, performing Gaussian filtering on the photo image, converting the photo image on which the Gaussian filtering is performed into a color space including a plurality of color components, and then generating a chromaticity division-map based on a color coordinate value in the color space, calculating average chromaticity of each respective region by using the chromaticity division-map, and compensating chromaticity of each respective region of the photo image by using the average chromaticity of each respective region.
The method may further include converting the photo image on which the Gaussian filtering is performed into a C-M-Y color space, wherein the generating of the chromaticity division-map may include converting the photo image on which Gaussian filtering is performed into an L-Cyb-Crg color space, and generating the chromaticity division-map based on a Cyb-Crg color coordinate value, and wherein the compensating includes compensating the chromaticity by using a CMY value converted from the photo image and the average chromaticity of each respective region.
The compensating may include compensating the photo image by using a weight obtained by converting a size of each channel in a converted Cyb-Crg space into a CMY value.
The generating of the chromaticity division-map may include checking Cyb and Crg values of each pixel, when the Cyb value is equal to or greater than a first predetermined threshold value, determining a corresponding pixel to be yellow, when the Cyb value is less than the first predetermined threshold value, determining the corresponding pixel to be blue, when the Crg value is equal to or greater than a second predetermined threshold value, determining the corresponding pixel to be red, and when the Crg value is less than the second predetermined threshold value, determining the corresponding pixel to be green.
The method may further include adjusting saturation and contrast of an image of which chromaticity is compensated.
The calculating may include calculating average chromaticity of each CMY value according to the following equation:
wherein ‘NPn’ is the number of pixels of a region ‘n’, ‘n’ is each region, and R(x,y,n), G(x,y,n), and B(x,y,n) are R, G and B values of each pixel of the region ‘n’, respectively.
The compensating may include compensating chromaticity of each respective region of the photo image according to the following equation:
wherein Wi(x,y) is a weight of pixels of each channel in the CMY color space, Lavg(n) is average lightness of each respective region, and li(x,y,n) is chromaticity of each respective pixel of an input photo image.
The foregoing and/or other features and utilities of the present general inventive concept may also be achieved by providing an image processing apparatus including a display unit to display a selection window about a plurality of methods of compensating chromaticity, and a chromaticity compensating unit to compensate chromaticity of a photo image by using a chromaticity compensating method that is selected in the selection window, wherein the chromaticity compensating method may include a compensating method based on an L-Cyb-Crg color space and at least one method based on estimation of a light source.
The image processing apparatus may further include a controller to control the display unit to display a photo image that is compensated by the chromaticity compensating unit, wherein the photo image may include at least one of an image captured by an imaging device installed in the image processing apparatus, an image transmitted from an external device, and an image read from a recording medium installed inside or outside the image processing apparatus.
The compensating method based on an L-Cyb-Crg color space may be a chromaticity compensating method including performing Gaussian filtering on a photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated chromaticity of each respective region.
The foregoing and/or other features and utilities of the present general inventive concept may also be achieved by providing a method of processing an image in an image processing apparatus, the method including when a program to compensate a photo image is operated, displaying an interface window to compensate chromaticity, when a photo image to be compensated is selected in the interface window, displaying the selected photo image, and receiving selection of a region to be compensated of the photo image, when the region to be compensated is selected, compensating the selected region by using a basic compensating method, displaying an image that is compensated by using the basic compensating method, when a menu to check a result obtained by using a different compensating method is selected, displaying at least one image that is compensated by using at least one different compensating method, when chromaticity compensation using the basic compensating method or the different compensating method is completed, displaying a menu to adjust saturation and contrast, and when the adjustment of the saturation and the contrast is completed, displaying an image including at least one executing command about a final image.
Any one of the basic compensating method and the different compensating method may be a compensating method based on an L-Cyb-Crg color space including performing Gaussian filtering on a photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated chromaticity of each respective region.
The foregoing and/or other features and utilities of the present general inventive concept may also be achieved by providing a method of processing an image, the method including: displaying photo images compensated according to a plurality of methods of compensating chromaticity, respectively, when one photo image among the displayed photo images is selected, compensating an original photo image by using a chromaticity compensating method corresponding to the selected photo image, and performing at least one operation of printing, transmitting and storing the compensated photo image, wherein the plurality of methods of compensating chromaticity may include a compensating method based on an L-Cyb-Crg color space and at least one compensating method based on estimation of a light source, and wherein the compensating method based on an L-Cyb-Crg color space is a chromaticity compensating method including performing Gaussian filtering on a photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated chromaticity of each respective region.
According to the above-described features of the inventive concept, a faded image may be effectively processed.
These and/or other features and utilities of the present general inventive concept will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:
Reference will now be made in detail to exemplary embodiments of the present general inventive concept, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The exemplary embodiments are described below in order to explain the present general inventive concept by referring to the figures.
The image compensating apparatus 100 of
In at least one exemplary embodiment illustrated in
A faded image generally has an overall red appearance. Thus, when color division is performed on the faded image in the L-Cyb-Crg color space, a large amount of a red component is detected in a Crg channel, and a large amount of yellow component is detected in a Cyb channel.
The image compensating apparatus 100 divides the chromaticity of the faded image by using the L-Cyb-Crg color space, and compensates the divided chromaticity of each respective region.
Examples of configurations of the image compensating apparatus 100 are illustrated in
Referring to
The input unit 110 may receive a photo image including an RGB channel. The RGB color channel refers to the red (R), green (G) and blue (B) components of the photo image. Each color channel may be analyzed separately to determine color information residing on a single respective color channel. More specifically, the input unit 110 may receive an image captured by an imaging unit (not illustrated) or a camera, which is placed outside the image compensating apparatus 100, or an image stored in a separate memory (not shown). Alternatively, the input unit 110 may receive an image from an independent device, a module, a unit, a chip, or the like, which is placed outside the image compensating apparatus 100. In this case, the input unit 110 may receive a general image as well as a faded image. That is, an image that is not faded may be compensated by an image compensating apparatus or an image processing apparatus, according to various exemplary embodiments of the inventive concept. In addition, a painting image as well as a photo image may be compensated.
The Gaussian filtering unit 120 performs Gaussian filtering on the photo image received by the input unit 110. The Gaussian filtering unit 120 applies a Gaussian filter to each RGB channel included in the photo image. In detail, the Gaussian filtering may be performed according to Equation 1 below.
Ri=Ii(x,y)*F(x,y)
F(x,y)=Ke^(−(x²+y²)/σ²) (1)
In Equation 1, ‘I’ denotes an RGB channel of an input image, and ‘F’ denotes a Gaussian filter. The Gaussian filter may be used so that noise included in the faded image may be blurred, and a contour phenomenon, which may occur when the divided chromaticity of each respective region is compensated, may be reduced.
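As an illustration of this per-channel filtering, the following is a minimal sketch assuming SciPy's gaussian_filter as the filter implementation and an arbitrary example sigma; neither the library nor the sigma value is specified in the original description.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_blur_rgb(image: np.ndarray, sigma: float = 5.0) -> np.ndarray:
    """Apply Gaussian filtering to each RGB channel of a float image in [0, 1].

    `sigma` is an assumed example value; the normalization constant K of
    Equation 1 is implicit in gaussian_filter's normalized kernel.
    """
    blurred = np.empty_like(image)
    for c in range(3):  # R, G, B channels, filtered independently as in Equation 1
        blurred[..., c] = gaussian_filter(image[..., c], sigma=sigma)
    return blurred
```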
When the Gaussian filtering unit 120 performs the Gaussian filtering, the division-map generating unit 140 converts an image Ri on which the Gaussian filtering is performed into the L-Cyb-Crg color space. In addition, the division-map generating unit 140 generates a chromaticity division map based on Cyb-Crg color coordinate values. In this case, the division-map generating unit 140 may allocate an 8-bit space to each pixel, and may assign a bit to each chromaticity axis so as to define the chromaticity of each pixel. In detail, as illustrated in Table 1 below, a bit corresponding to each pixel position is set based on the divided chromaticity.
TABLE 1 (bit values of 1 or 0 assigned to the Y, B, R, and G chromaticity axes for each pixel position)
The division-map generating unit 140 may mark bits as illustrated in Table 1, may compare the chromaticity bits, and may obtain the chromaticity values of the corresponding positions. Thus, an image that is blurred by performing the Gaussian filtering may be divided into regions according to chromaticity. A division map will be described later.
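A minimal sketch of how such a division map might be generated is shown below. The exact RGB-to-L-Cyb-Crg transform is not reproduced in this text, so a generic opponent-color transform (L as the mean of R, G and B, Cyb as a yellow-blue difference, Crg as a red-green difference) is assumed here for illustration, together with the zero thresholds on Cyb and Crg described later.

```python
import numpy as np

def rgb_to_lcybcrg(rgb: np.ndarray):
    """Assumed opponent-color transform; illustrative only, not the exact published matrix."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    lightness = (r + g + b) / 3.0   # L
    cyb = (r + g) / 2.0 - b         # yellow (+) / blue (-) axis
    crg = r - g                     # red (+) / green (-) axis
    return lightness, cyb, crg

def chromaticity_division_map(blurred_rgb: np.ndarray) -> np.ndarray:
    """Mark per-pixel membership bits for the Y, B, R and G chromaticity axes (bits 3..0)."""
    _, cyb, crg = rgb_to_lcybcrg(blurred_rgb)
    y_bit = (cyb >= 0).astype(np.uint8) << 3
    b_bit = (cyb < 0).astype(np.uint8) << 2
    r_bit = (crg >= 0).astype(np.uint8) << 1
    g_bit = (crg < 0).astype(np.uint8)
    return y_bit | b_bit | r_bit | g_bit
```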
The calculating unit 150 calculates average chromaticity of each respective region by using the division map generated by the division-map generating unit 140. In detail, the calculating unit 150 estimates the faded chromaticity in each of the divided regions of the division map. In this case, the RGB image that is Gaussian-blurred is converted into the CMY color space. After the conversion into the CMY color space, the division values may be used to calculate the average chromaticity of each respective region, and also the average chromaticity of each respective channel. In this case, chromaticity is estimated and compensated based on the gray world assumption. For example, applying the gray world assumption to a printing medium allows for the assumption that the average of each respective channel in an image is gray. Further, under the gray world assumption, the chromaticity of an image is divided in a complementary color space having L-Cyb-Crg values, and the faded chromaticity of each respective region is estimated based on the divided chromaticity.
The calculating unit 150 applies the gray world assumption to the faded image in the CMY color space. The calculating unit 150 may calculate the average chromaticity of each CMY value according to Equation 2 below.
In Equation 2, ‘NPn’ denotes the number of pixels of a region ‘n’, ‘n’ denotes each region, and R(x,y,n), G(x,y,n), and B(x,y,n) denote R, G and B values of each pixel of the region ‘n’, respectively.
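Equation 2 itself is an image in the original publication and is not reproduced here. As a hedged illustration of the per-region averaging it describes, the sketch below computes the average of each CMY channel over each region of the division map, assuming the common complement relation C = 1 − R, M = 1 − G, Y = 1 − B; the exact RGB-to-CMY conversion and the exact form of Equation 2 are not restated in this text.

```python
import numpy as np

def region_average_cmy(rgb_blurred: np.ndarray, division_map: np.ndarray) -> dict:
    """Average CMY chromaticity per region 'n' (illustrative stand-in for Equation 2).

    rgb_blurred:  HxWx3 Gaussian-blurred RGB image with values in [0, 1]
    division_map: HxW array of region labels derived from the Cyb-Crg bits
    """
    cmy = 1.0 - rgb_blurred  # assumed conversion: C = 1 - R, M = 1 - G, Y = 1 - B
    averages = {}
    for n in np.unique(division_map):
        mask = division_map == n
        np_n = int(mask.sum())                           # NPn: number of pixels in region n
        averages[int(n)] = cmy[mask].sum(axis=0) / np_n  # per-channel region averages
    return averages
```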
The converting unit 130 converts the photo image on which the Gaussian filtering is performed, or a photo image input to the converting unit 130 through the input unit 110, into the CMY color space. That is, as described above, during the calculating operation of the calculating unit 150, the RGB image that is Gaussian-blurred is converted into the CMY color space. This operation may be performed by the converting unit 130, and the resulting values may then be provided to the calculating unit 150.
In addition, the converting unit 130 may convert an input image itself into the CMY color space, and may provide the resulting values to the compensating unit 160.
The compensating unit 160 compensates chromaticity of each respective region of the photo image by using values of CMY converted by the converting unit 130, and values calculated by the calculating unit 150, including the average chromaticity values of each CMY value. More specifically, the compensating unit 160 may remove the faded chromaticity by multiplying the average chromaticity of each respective region, which is calculated by the calculating unit 150, by an average chromaticity ratio to lightness. That is, the compensating unit 160 may perform compensation according to Equation 3 below.
In Equation 3, Wi(x,y) denotes a weight of pixels of each channel in the CMY color space. That is, Wi(x,y) denotes a weight obtained by converting a size of each channel in a converted Cyb-Crg space into a CMY value.
More specifically, in Equation 3, in order to determine Wi(x,y), a value obtained by converting an image on which Gaussian convolution is performed into Cyb-Crg is used. In this case, when Cyb(x,y) is equal to or greater than 0, the pixel is determined to be Yellow. When Cyb(x,y) is less than 0, the pixel is determined to be Blue. When Crg(x,y) is equal to or greater than 0, the pixel is determined to be Red. When Crg(x,y) is less than 0, the pixel is determined to be Green. Cyb(x,y) and Crg(x,y) are normalized for each respective channel by dividing them by the maximum values Ymax, Bmax, Rmax, and Gmax of the respective regions Y, B, R, and G, and the values expressed in terms of Cyb-Crg are converted into values in the CMY color space. In this case, the normalized Wi (i=c, m, y) is calculated by using Yellow itself, C=(G+B)/2, and M=(R+B)/2. The compensating unit 160 may perform compensation by using the above-described method.
In Equation 3 above, Lavg(n) may be an average lightness of each respective region, and li(x,y,n) may be chromaticity of each respective pixel of the input photo image. li(x,y,n) is a resulting value obtained by converting the photo image input to the input unit 110 into a CMY image in the converting unit 130.
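Equation 3 is likewise not reproduced here. The sketch below shows one plausible reading of the compensation step under the stated gray world assumption: each CMY channel of each region is pulled toward the region's average lightness Lavg(n), modulated by the per-pixel channel weight Wi(x,y). The multiplicative form and the treatment of the weights are assumptions made for illustration, not the published equation.

```python
import numpy as np

def compensate_region(cmy: np.ndarray, region_mask: np.ndarray,
                      weights: np.ndarray) -> np.ndarray:
    """Illustrative per-region chromaticity compensation (not the exact Equation 3).

    cmy:         HxWx3 image in the CMY color space, values in [0, 1]
    region_mask: HxW boolean mask selecting the pixels of region n
    weights:     HxWx3 per-channel weights Wi(x,y) derived from the Cyb-Crg channel sizes
    """
    out = cmy.copy()
    l_avg = cmy[region_mask].mean()  # stand-in for the average lightness Lavg(n)
    for i in range(3):               # C, M, Y channels
        ch = cmy[..., i]
        ch_avg = ch[region_mask].mean()  # faded average chromaticity of channel i in region n
        if ch_avg > 0:
            gain = l_avg / ch_avg        # gray-world style correction toward Lavg(n)
            corrected = ch * (1.0 + weights[..., i] * (gain - 1.0))
            out[..., i] = np.where(region_mask, np.clip(corrected, 0.0, 1.0), ch)
    return out
```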
The image compensating apparatus 100 of
In
First, the saturation and contrast adjusting unit 170 applies a Gamma curve to a dark region (e.g., less than 0.5) in order to adaptively adjust contrast according to lightness of an image. The Gamma curve may be defined as:
γ=1+|0.5−meanI| (4)
The saturation and contrast adjusting unit 170 may apply different Gamma curves according to the average lightness of an image. That is, contrast of the image may be increased by applying x^(1/γ) when the average lightness of the image is low, and by applying x^γ when the average lightness of the image is high.
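A minimal sketch of this adaptive gamma adjustment of Equation 4, assuming lightness values normalized to [0, 1]:

```python
import numpy as np

def adaptive_gamma(lightness: np.ndarray) -> np.ndarray:
    """Adaptive contrast adjustment: gamma = 1 + |0.5 - mean(I)| per Equation 4."""
    mean_i = float(lightness.mean())
    gamma = 1.0 + abs(0.5 - mean_i)
    if mean_i < 0.5:
        return np.power(lightness, 1.0 / gamma)  # low average lightness: apply x^(1/gamma)
    return np.power(lightness, gamma)            # high average lightness: apply x^gamma
```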
The saturation and contrast adjusting unit 170 may improve saturation according to the lightness of the image in a hue, saturation and intensity (HSI) color space, in order to increase saturation that is reduced by fading of the image. Saturation in the range of 10 to 40% may be adjusted with respect to a lightness value. In this case, a default value may be determined through a subjective estimation test. For example, 20% may be determined as a default value.
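A minimal sketch of the lightness-dependent saturation boost is shown below; the linear mapping of the 10 to 40% range over lightness is an assumption, since only the range and the 20% default are given above.

```python
import numpy as np

def boost_saturation(saturation: np.ndarray, lightness: np.ndarray) -> np.ndarray:
    """Increase HSI saturation by 10-40% according to lightness (assumed linear mapping)."""
    gain = 0.10 + 0.30 * np.clip(lightness, 0.0, 1.0)  # 10% for dark pixels up to 40% for bright ones
    return np.clip(saturation * (1.0 + gain), 0.0, 1.0)
```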
As a result of improving saturation, an image portion such as a skin region may also appear smoother than in an image of which only the chromaticity is compensated.
In general, chromaticity of an image captured by a camera varies according to chromaticity of a light source. Such a process of recognizing color by using light may be defined according to Equation 5 below.
Qi=∫Si(λ)R(λ)L(λ)dλ (5)
In Equation 5, “Qi” is recognized color, “Si” is a response function of human vision, “R” is reflectivity of an object, “L” is a light source, and “i” is each channel R, G, or B. An image captured by using light is determined based on an influence of a uniform light source and reflectivity of an object.
However, a fading degree of a printed image may vary according to the characteristics of dye, the characteristics of photographic paper, a light source, an environment, and the like. That is, even when portions of an image are affected by the same light source, the fading degree of the portions of the image may be different according to colors of the portions of the image, unlike a change in chromaticity when a uniform light source is used.
In order to analyze such characteristics of a faded image, a patch image may be printed using a photo printer, such as a Frontier 570 photo printer. A first printed image is kept in an indoor dark room, and a second printed image is exposed to sunlight outdoors for about 8 weeks to be faded. The results are illustrated in
As illustrated in
Referring to
F=I·E≅hours(I·E)
hours(I·E)≅hours(P·R·E·H·T+N) (6)
In Equation 6, F is a faded image, I is an original image, and E is a light source. In general, a model of the faded image may be expressed by the product of the light source and the original image over time, as illustrated in the upper equation of Equation 6. In detail, the model may be expressed by dye characteristics P, humidity H, temperature T, and noise N, as illustrated in the lower equation of Equation 6. In a conventional method of compensating an image based on estimation of a light source, parameters other than the light source and the reflectivity of an object are also present, but are treated as negligible. However, when a faded image is compensated, humidity, temperature, and the type of light source, as well as dye characteristics, largely affect a physical change in the dye, and thus are not negligible. However, since a faded image is exposed to various surroundings, it is difficult to estimate the humidity, temperature, and type of light source of the faded image. Thus, a compensating method similar to that for an original image may be applied to the faded image.
Thus, a method of compensating an image based on estimation of a light source according to the color recognition model of human vision of Equation 5 may be used. That is, if a single uniform light source is used, the variations of all hues of an image are uniform, so that the estimated chromaticity may be applied to all regions. However, as described above, different results are obtained for the faded image. That is, even though an image is affected by a single uniform light source, the variations of the hues of a printed image differ according to the dye characteristics and the surrounding characteristics of the printed image.
In consideration of this point, as illustrated in
In the compensating method based on the L-Cyb-Crg color space, the chromaticity of a faded image is considered when the faded image is digitized, unlike in the color recognition model of human vision according to Equation 5, which is assumed in a conventional method of compensating an image based on estimation of a light source.
The color of a faded image is compensated under two assumptions. The first assumption is that patches of the same color fade to the same color. The second assumption is the gray world assumption. Based on these assumptions, chromaticity is divided based on the L-Cyb-Crg color space, the chromaticity of each respective region is estimated based on the divided chromaticity, and then the estimated chromaticity may be compensated.
In at least one exemplary embodiment, a faded image may be compensated by using a CMY color space instead of a conventional RGB color space. Accordingly, compensation of the various dyes of the faded image may be achieved.
Thus, unpredictable variation as well as predictable variation may be compensated.
The chromaticity division-map may be generated in various ways.
Referring to
Then, a division map is generated based on the converted image (S1130).
More specifically, the Cyb and Crg values of each pixel included in the converted image are analyzed. In this case, when Cyb(x,y) is equal to or greater than 0, the pixel is determined to be Yellow. When Cyb(x,y) is less than 0, the pixel is determined to be Blue. When Crg(x,y) is equal to or greater than 0, the pixel is determined to be Red. When Crg(x,y) is less than 0, the pixel is determined to be Green. Thus, as illustrated in Table 1, a plurality of bit values may indicate the region of each respective pixel, and a division map may be generated for each respective pixel.
The graphs illustrated in
The above-described image compensating apparatus 100 may be installed or used in various types of image processing apparatuses. The image processing apparatus may be embodied as an apparatus to form an image on various recording media such as paper, a screen or a memory, for example, an image forming apparatus such as a printer, a scanner, a copier or a multifunctional printer, or a display apparatus such as a television (TV), a monitor, a laptop computer or a personal computer (PC).
According to various exemplary embodiments of the inventive concept, the image processing apparatus may compensate an image by using a compensating method based on the L-Cyb-Crg color space only, or alternatively, may compensate an image by using the compensating method based on the L-Cyb-Crg color space as well as a conventional method based on estimation of a light source.
That is, the image processing apparatus may show various images by using the above-described L-Cyb-Crg color space and at least one conventional method, and may allow a user to select a desired image. Examples of conventional methods include GWA, WR, CGWR, and the like. Thus, the user may directly preview images compensated in various ways, and may select a desired compensation method. Accordingly, the user may obtain a compensating method in which the user's preference is reflected. The Gray World Assumption (GWA) indicates a method of correcting chromaticity based on the assumption that the average chromaticity value of an image for each channel corresponds to gray. That is, according to the GWA, the average chromaticity value of an image for each channel is calculated, and a compensation coefficient is calculated based on the result of the calculation. White patch retinex (WR) indicates a method of correcting chromaticity based on the assumption that the RGB values of the pixel having the maximum brightness value of an image are the chromaticity of the light source. Combining gray world and white patch retinex (CGWR) indicates a method of correcting an image by modeling an equation that can satisfy both the GWA and the WR assumption and calculating a coefficient of the equation to compensate the image.
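For reference, the following is a minimal generic sketch of the gray world correction described above; the per-channel gain form is a common textbook implementation and is not taken from this text's coefficient calculation.

```python
import numpy as np

def gray_world_correction(rgb: np.ndarray) -> np.ndarray:
    """Scale each channel so that its average matches the overall gray level (GWA)."""
    means = rgb.reshape(-1, 3).mean(axis=0)   # per-channel average chromaticity
    gray = means.mean()                       # target gray level
    gains = gray / np.maximum(means, 1e-6)    # compensation coefficients
    return np.clip(rgb * gains, 0.0, 1.0)
```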
The photo region may be manually selected from the obtained image by a user.
Thus, when the photo image is selected, the chromaticity of the photo image may be compensated by using a basic compensating method including, but not limited to, a conventional chromaticity compensating method based on an estimation of a light source. Then, the resulting image of which chromaticity is compensated is displayed (S1530).
When the user previews the resulting image compensated by using the basic compensating method, if the user is satisfied with the resulting image, at least one operation of printing, transmitting, and storing may be performed on the resulting image (S1560, S1570, and S1580). The user may select a printing menu, a transmitting menu, a storing menu, or the like, on an interface window. According to the selected menu, the resulting image itself may be printed, or alternatively, the resulting image may be transmitted to an external apparatus or may be stored in a memory.
When the user is not satisfied with the resulting image of which chromaticity is compensated by using the basic compensating method, the user may select a menu to obtain a compensated image by using a different compensating method (S1540). The different compensating method may include, but is not limited to, the L-Cyb-Crg color space compensation method described in detail above. In response to selecting the different compensating method, the basic compensating setting mentioned above may be changed so that the different compensating method may be performed (S1550).
Referring to
As mentioned above, light and dark regions of an image may be affected differently by long periods of light exposure. For example, the chroma of a light region may not be changed by light exposure, while the chroma of dark regions may be significantly affected. In addition, improving saturation may also make a skin region appear smoother than in an image of which only the chromaticity is compensated. Further, the fading degree of portions of the image may differ according to the colors of those portions. Accordingly, dyes used for dark skin regions of persons included in an image may have a higher degree of fading than dyes used for light skin regions.
Accordingly, at least one exemplary embodiment illustrated in
In another exemplary embodiment, a user may select one or more skin regions of an image displayed in a selected photo 190, as discussed above. In response to the selection of the skin region, the image processing apparatus may determine the race of the image corresponding to the selected skin region, and may automatically compensate the color of the skin region based on an ideal color compensation method corresponding to the detected race.
Referring to
The interface unit 310 transmits and receives data to and from various storage media, or various devices that are installed inside or outside the image processing apparatus 300. The image processing apparatus 300 may be connected to a digital camera, a memory stick, a memory card, or the like through the interface unit 310 so as to receive a photo image. That is, the photo image may be at least one of an image captured by an imaging device installed in the image processing apparatus 300, an image transmitted from an external device, and an image read from a recording medium installed inside or outside the image processing apparatus 300.
The display unit 340 may display an image to select a plurality of methods of compensating chromaticity. According to another exemplary embodiment of the inventive concept, as illustrated in
The chromaticity compensating unit 370 compensates chromaticity of the photo image by using a method of compensating chromaticity that is selected in the selection image. In detail, the method of compensating chromaticity may include the compensating method based on an L-Cyb-Crg color space, and at least one method based on estimation of a light source.
In this case, the chromaticity compensating unit 370 may include the same structure as that of the image compensating apparatus 100 of
The controller 320 may control the display unit 340 to display the photo image that is compensated by the chromaticity compensating unit 370.
The communication unit 350 may transmit the final image to an external device, or may receive an original image from an external device through network communication. Additionally, the communication unit 350 may communicate with a remote communication unit 350′ included in a remotely located image processing apparatus 300′ via a network, as discussed further below.
The image forming unit 360 performs an image forming operation of printing the final image on paper and/or recording the final image on a recording medium. More specifically, when a user selects a printing menu, the image forming unit 360 prints the final image itself on paper. The image forming unit 360 may be configured in various ways according to a printing method. If the image forming unit 360 is of a laser type, the image forming unit 360 may include a charging unit, an exposing unit, a photoconductor, a developing unit, a transferring unit, a fixing unit, or the like. Such a configuration of the image forming unit 360 is known in the art, and thus a detailed description thereof will be omitted herein.
The storage unit 330 may store a program to compensate chromaticity, saturation, contrast, or the like, and may store an original image to be compensated. In addition, according to users' selection, the controller 320 may store the final image in the storage unit 330.
The communication unit 350 allows the image processing apparatus 300 to communicate over a network. Referring to
The server module 500 may further include a storage unit 502 that stores one or more original photo images, and a compensation module 504, which compensates fading of a faded photo image, as discussed in detail above. The compensation module 504 may be in communication with the storage unit 502 to receive a faded image photo among the original photo images stored in the storage unit 502.
The image processing system 102 further includes a first image processing apparatus 300 and a second image processing apparatus 300′ remotely located from the first processing apparatus 300. The first image processing apparatus 300 and the second image processing apparatus 300′ may be in communication with the cloud network 380, as discussed in greater detail below.
Each of the first and second image processing apparatuses 300/300′ includes a controller 320/320′, a display 340/340′, and a communication unit 350/350′. The communication unit 350 communicates with the remote communication unit 350′ included in the remotely located image processing apparatus 300′ via a network 380. Each of the first and second image processing apparatuses 300/300′ may include a web-browser interface that controls the compensation module 504 via the Internet. The web-browser may generate control signals based on various Internet environment programming languages including, but not limited to, Java.
The cloud network 380 allows a first user to collaborate with a remotely located second user in real-time to compensate fading of a selected faded image 390. For example, a first user may operate a first image processing apparatus 300, which includes a first interface window 340. The first interface window 340 may operate as discussed in detail above. A second user remotely located from the first user may operate a remotely located image processing apparatus 300′, which includes a second interface window 340′.
The first and/or second user may select a desired faded photo 390 to be compensated using the fading compensation methods discussed in detail above. For example, the first and/or second controller 320/320′ may generate a control signal that selects a faded photo 390 from the storage unit 502 and inputs the faded photo 390 to the compensation module 504. When the faded photo 390 is selected, the faded photo 390 is simultaneously displayed in both the first interface window 340 and the second interface window 340′. Accordingly, a first user and a remotely located second user may collaborate with one another in real-time to generate a finally processed photo that compensates the fading of the faded photo 390.
More specifically, the cloud network 380 allows a user of the first image processing apparatus 300 and a user of a remotely located second image processing apparatus 300′ to dynamically apply fading compensation methods, as discussed above, to the selected faded photo 390 simultaneously displayed on each of the first interface window 340 and the second interface window 340′. Referring to
Additionally, each of the first and second interface windows 340, 340′ may include an indicator 400, which indicates to the first and second users that a compensation of the faded photo 390 is currently in progress. For example, the indicator 400 may be displayed in the first interface window 340 in response to the second user completing a fading compensation of the faded photo 390 via the remote second interface window 340′ of the remotely located image processing apparatus 300′. In another exemplary embodiment, the indicator 400 may be displayed in the first interface window 340 while the second user is in the process of compensating the faded photo 390. When the second user has completed the fading compensation using the remotely located second image processing apparatus 300′, the indicator 400 disappears from the first interface window 340 of the first image processing apparatus 300. Thus, the indicator 400 allows the first user and the remotely located second user to easily determine when to begin a respective compensation of the faded photo 390, without interfering with each other.
In this case, the compensating method based on an L-Cyb-Crg color space, and at least one method based on estimation of a light source, may be displayed on the selection image. The user may select a desired compensating method. In addition, the compensating methods may be displayed by displaying the name of each compensating method, or by displaying a resulting image that is compensated by using the corresponding compensating method.
In addition, the compensating method based on an L-Cyb-Crg color space may be a method including performing Gaussian filtering on a photo image, converting the photo image into the L-Cyb-Crg color space, generating a chromaticity division-map based on a Cyb-Crg color coordinate value, calculating average chromaticity of each respective region by using the chromaticity division-map, and then compensating chromaticity of each respective region of the photo image by using a C-M-Y image of the photo image and the calculated chromaticity of each respective region. The compensating method based on an L-Cyb-Crg color space has been described with reference to
Referring to
When the program is executed, an interface window to compensate chromaticity is displayed (S2620).
Thus, when a photo image to be compensated is selected in the interface window, the selected photo image is displayed, and a region to be compensated is selected in the photo image (S2630).
When the region to be compensated is selected, the selected region may be compensated by using a basic compensating method (S2640). Then, an image that is compensated by using the basic compensating method is displayed (S2650).
Thus, when a menu to check a result obtained according to a different compensating method is selected (S2660), at least one image that is compensated by using at least one different compensating method is displayed (S2670).
Then, a menu to adjust saturation and contrast is displayed, and the saturation and contrast are adjusted according to the menu (S2680).
When the saturation and contrast are completely adjusted, a final image is displayed (S2690). An image including at least one processing command about the final image is displayed. An operation is performed according to a menu selected in the image. That is, an operation such as printing, transmission, storing, or the like may be performed.
According to the above-described exemplary embodiments, a photo image may be converted into the L-Cyb-Crg color space, and a chromaticity division-map is generated. In at least one exemplary embodiment of the present general inventive concept, the photo image may be converted into an RGB color space, or other color spaces, and the chromaticity division-map may be generated. In this case, an operation of converting a photo image on which Gaussian filtering is performed into a C-M-Y color space may be omitted.
As described above, a faded image may be recovered to be as close to the original image as possible by appropriately compensating the faded image. Further, compensation of nonuniform chromatic fading in a photographic image may be achieved.
According to the above-described exemplary embodiments of the inventive concept, methods of compensating and processing an image may be stored in various types of recording media, and may be embodied by a program code that is executed by a central processing unit (CPU) included in an electronic device.
In detail, the program code to execute the methods of compensating and processing an image may be stored in various types of recording media that are capable of being read by a reader, such as a random access memory (RAM), a flash memory, a read only memory (ROM), an erasable programmable ROM (EPROM), an electronically erasable and programmable ROM (EEPROM), a register, a hard disk, a removable disk, a memory card, a universal serial bus (USB) memory, CD-ROM, or the like.
Although a few exemplary embodiments of the present general inventive concept have been illustrated and described, it will be appreciated by those skilled in the art that changes may be made in these exemplary embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the appended claims and their equivalents.
Kim, Hyun-Cheol, Cho, Min-ki, Oh, Hyun-soo, Kim, Dae-Chul, Kim, Kyeong-man, Ha, Yeong-ho, Song, Eun-Ah, Kyung, Wang-jun