A solid-state image sensor includes pixels each including first and second photoelectric conversion portions provided in a substrate, the second photoelectric conversion portion having lower sensitivity than the first photoelectric conversion portion; a barrier region provided between the first and second photoelectric conversion portions; a waveguide provided on a light-entrance side of the substrate and including a core and a cladding; and a protective layer provided between the waveguide and the substrate. Seen in a direction perpendicular to the substrate, a center of an exit face of the core is on a first-photoelectric-conversion-portion side with respect to a center of the barrier region in each pixel in a central part of a pixel area. A standard deviation of a refractive-index distribution of the protective layer in a region directly below the exit face of the core is 0.1 or smaller in an in-plane direction of the substrate.
1. A solid-state image sensor comprising:
pixels provided in a pixel area,
the pixels each including
a first photoelectric conversion portion and a second photoelectric conversion portion that are provided in a substrate, the second photoelectric conversion portion having lower sensitivity than the first photoelectric conversion portion;
a barrier region provided between the first photoelectric conversion portion and the second photoelectric conversion portion;
a waveguide provided on a light-entrance side of the substrate and including a core and a cladding; and
a protective layer provided between the waveguide and the substrate,
wherein, when a surface of the substrate is seen in a direction perpendicular to the substrate, a center of an exit face of the core is positioned on a first-photoelectric-conversion-portion side with respect to a center of the barrier region at the surface of the substrate in each of pixels that are provided in a central part of the pixel area, and
wherein a standard deviation of a distribution of refractive index of the protective layer in a region directly below the exit face of the core is 0.1 or smaller in an in-plane direction of the substrate.
2. A solid-state image sensor according to
3. A solid-state image sensor according to
4. A solid-state image sensor according to
5. A solid-state image sensor according to
6. A solid-state image sensor according to
7. A solid-state image sensor according to
8. A solid-state image sensor according to
9. A solid-state image sensor according to
10. A solid-state image sensor according to
11. A solid-state image sensor according to
12. A solid-state image sensor according to
13. A solid-state image sensor according to
S1×T1=S2×T2, and C1/(S1×T1)=C2/(S2×T2), where S1, C1, and T1 denote sensitivity, capacity, and exposure time, respectively, of the first photoelectric conversion portion, and S2, C2, and T2 denote sensitivity, capacity, and exposure time, respectively, of the second photoelectric conversion portion.
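The pair of relations in this claim can be checked numerically. Below is a minimal sketch; the function name, the tolerance, and all parameter values are hypothetical and purely illustrative:

```python
# S: sensitivity, C: capacity, T: exposure time, for the first (1) and
# second (2) photoelectric conversion portions. Units are arbitrary.

def satisfies_claim_relations(S1, C1, T1, S2, C2, T2, tol=1e-9):
    """Check S1*T1 = S2*T2 and C1/(S1*T1) = C2/(S2*T2) within a tolerance."""
    eq_exposure_product = abs(S1 * T1 - S2 * T2) <= tol
    eq_capacity_ratio = abs(C1 / (S1 * T1) - C2 / (S2 * T2)) <= tol
    return eq_exposure_product and eq_capacity_ratio

# Example: a high-sensitivity portion exposed briefly and a low-sensitivity
# portion exposed four times as long, so the S*T products match.
print(satisfies_claim_relations(S1=4.0, C1=1000.0, T1=1.0,
                                S2=1.0, C2=1000.0, T2=4.0))  # True
```

Note that the two relations together imply C1 = C2, since equal S×T products make the two denominators equal.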
14. An image pickup apparatus comprising:
the solid-state image sensor according to
15. An image pickup apparatus according to
1. Field of the Invention
The present invention relates to solid-state image sensors and particularly to a solid-state image sensor included in an image pickup apparatus such as a digital camera.
2. Description of the Related Art
In recent years, there has been an increasing demand for images taken by apparatuses, such as a digital camera, with wider dynamic ranges. In response to such a demand, a solid-state image sensor is proposed by Japanese Patent Laid-Open No. 2004-363193, in which a plurality of photoelectric conversion portions having different areas are provided in each pixel, so that two kinds of pixel signals, which are a high-sensitivity signal and a low-sensitivity signal, are acquired. With a combination of the two signals, the dynamic range is widened.
There has been another increasing demand for a camera capable of taking a moving image and a still image simultaneously. In general, to acquire a smooth moving image, the moving image may be taken with a period of exposure time that substantially corresponds to the frame period of readout by the solid-state image sensor. In contrast, to take a still image, the period of exposure time may be set in accordance with the speed at which the object moves. Hence, to take a still image and a moving image simultaneously, two kinds of pixel signals based on different periods of exposure time need to be acquired.
In Japanese Patent Laid-Open No. 2004-120391, a solid-state image sensor is disclosed that includes a plurality of photoelectric conversion elements (equivalent to the photoelectric conversion portions according to Japanese Patent Laid-Open No. 2004-363193) provided in each pixel and operating with different periods of exposure time, so that a moving-image signal and a still-image signal can be acquired simultaneously. Photoelectric conversion elements for a relatively short period of exposure time each have a relatively large area, whereas photoelectric conversion elements for a relatively long period of exposure time each have a relatively small area. Furthermore, the sensitivity of photoelectric conversion elements for a moving image is different from the sensitivity of photoelectric conversion elements for a still image.
Note that “sensitivity of a photoelectric conversion portion” is defined by the ratio of the amount of charge accumulated in the photoelectric conversion portion to the quantity of light that is incident on the pixel per unit time.
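As a unit-agnostic illustration of this definition (the function and variable names below are hypothetical):

```python
def portion_sensitivity(accumulated_charge, incident_light_per_unit_time):
    """'Sensitivity of a photoelectric conversion portion' as defined above:
    the charge accumulated in the portion divided by the quantity of light
    incident on the whole pixel per unit time. Units are illustrative."""
    return accumulated_charge / incident_light_per_unit_time

# A portion accumulating 400 units of charge under 100 units of light per
# unit time is twice as sensitive as one accumulating 200 units.
print(portion_sensitivity(400.0, 100.0) / portion_sensitivity(200.0, 100.0))  # 2.0
```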
In each of the solid-state image sensors disclosed by Japanese Patent Laid-Open No. 2004-363193 and Japanese Patent Laid-Open No. 2004-120391, a desired image is taken with a plurality of photoelectric conversion portions that are provided in each pixel and have different levels of sensitivity. In each of the devices, light is condensed by a microlens provided on the surface of the pixel, whereby the light is guided to each of the photoelectric conversion portions. Hence, the quantity of light that is incident on each of the photoelectric conversion portions varies with the angle of incidence of the light on the pixel. Therefore, the photoelectric conversion portions each receive only a portion of the light that is emitted from a specific part of the exit pupil of an image pickup lens used. Consequently, a blurred image of an object that is out of focus may be distorted, resulting in a deterioration of image quality.
An object of the present invention is to suppress the deterioration of image quality by reducing the angular dependence of the sensitivity of each of a plurality of photoelectric conversion portions that are provided in each pixel and have different levels of sensitivity.
A solid-state image sensor according to a first aspect of the present invention includes pixels provided in a pixel area. The pixels each include a first photoelectric conversion portion and a second photoelectric conversion portion that are provided in a substrate, the second photoelectric conversion portion having lower sensitivity than the first photoelectric conversion portion; a barrier region provided between the first photoelectric conversion portion and the second photoelectric conversion portion; a waveguide provided on a light-entrance side of the substrate and including a core and a cladding; and a protective layer provided between the waveguide and the substrate. When a surface of the substrate is seen in a direction perpendicular to the substrate, a center of an exit face of the core is positioned on a first-photoelectric-conversion-portion side with respect to a center of the barrier region at the surface of the substrate in each of pixels that are provided in a central part of the pixel area. A standard deviation of a distribution of refractive index of the protective layer in a region directly below the exit face of the core is 0.1 or smaller in an in-plane direction of the substrate.
An image pickup apparatus according to a second aspect of the present invention includes the solid-state image sensor according to the first aspect that is provided in a housing.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Embodiments of the solid-state image sensor according to the present invention will now be described with reference to the drawings, wherein like elements or elements having equivalent functions are denoted by like reference numerals, and redundant description thereof is omitted.
Pixels
Pixels 101 refer to pixels provided in a central part 102 of the pixel area 103. Herein, the pixels 101 provided in the central part 102 refer to pixels whose centroids are positioned in the central part 102.
The central part 102 refers to an area within a predetermined distance from the center of the pixel area 103. The predetermined distance is preferably 1/4 of the length of the diagonal of the pixel area 103 or shorter, or more preferably 1/20 of the length of the diagonal of the pixel area 103 or shorter.
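This geometric criterion can be expressed as a short check. The sketch below assumes a rectangular pixel area with its origin at one corner; the function name and all dimensions are hypothetical:

```python
import math

def in_central_part(centroid, area_width, area_height, fraction=0.25):
    """Return True if a pixel centroid lies within `fraction` of the
    pixel-area diagonal from the center of the pixel area. fraction=0.25
    (1/4) is the preferred bound; 1/20 is the more preferred one."""
    cx, cy = area_width / 2.0, area_height / 2.0
    diagonal = math.hypot(area_width, area_height)
    dist = math.hypot(centroid[0] - cx, centroid[1] - cy)
    return dist <= fraction * diagonal

# A pixel at the exact center of a hypothetical 4000 x 3000 area is central;
# a pixel at the corner is not.
print(in_central_part((2000.0, 1500.0), 4000.0, 3000.0))  # True
print(in_central_part((0.0, 0.0), 4000.0, 3000.0))        # False
```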
In the solid-state image sensor 100 illustrated in
The pixel 101 includes, in order from a light-entrance side thereof, a waveguide 110 including a core 111 and a cladding 112, a protective layer 116, and the substrate 120. The substrate 120 includes a first photoelectric conversion portion 121, a second photoelectric conversion portion 122, and a barrier region 123 provided between the photoelectric conversion portions 121 and 122.
Photoelectric Conversion Portions and Barrier Region
The photoelectric conversion portions 121 and 122 are formed by producing potential variations in the substrate 120 by ion implantation or the like. The substrate 120 is made of silicon or the like that absorbs light having a wavelength band to be detected. The barrier region 123 has a potential barrier that suppresses the occurrence of charge crosstalk between the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122. As illustrated in
The size of the potential barrier in the barrier region 123 may be determined with consideration for the permissible amount of charge crosstalk between the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122. To acquire pixel signals for the respective photoelectric conversion portions 121 and 122 independently of each other, the potential barrier may be made high so that the amount of charge crosstalk is reduced. More specifically, the potential barrier in the barrier region 123 may have a height greater than or equal to the height of potential barriers (denoted by reference numerals 128 and 129 in
To produce potential variations in the pixel 101 as illustrated in
The first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 do not necessarily need to be arranged side by side in the X direction as illustrated in
Protective Layer
The protective layer 116 is provided for reducing the damage to the photoelectric conversion portions 121 and 122 during a manufacturing process and for preventing impurities from entering the photoelectric conversion portions 121 and 122 from other members such as wiring lines 125. In addition, the protective layer 116 may have another function such as an anti-reflection function of suppressing the reflection of light that is incident on the photoelectric conversion portions 121 and 122 from the core 111. Moreover, the protective layer 116 may include a plurality of layers that are stacked in the direction perpendicular to the surface of the substrate 120.
Waveguide
The waveguide 110 is formed of different materials that are combined such that the core 111 has a higher refractive index than the cladding 112. The materials for the core 111 and the cladding 112 can be selected from inorganic materials such as silicon oxide, silicon nitride, silicon oxynitride, silicon carbide, and borophosphosilicate glass (BPSG), and organic materials such as polymer and resin.
A broken line 111E illustrated in
Light that is propagated in the waveguide 110 concentrates on the core 111. Therefore, the quantity of light that is incident on the first photoelectric conversion portion 121 provided relatively near the center 113 of the exit face 111E of the core 111 is larger than the quantity of light that is incident on the second photoelectric conversion portion 122 provided relatively far from the center 113 of the exit face 111E of the core 111. That is, shifting the center 113 of the exit face 111E of the core 111 toward the first-photoelectric-conversion-portion side with respect to the center 124 of the barrier region 123 at the surface of the substrate 120 makes the sensitivity of the first photoelectric conversion portion 121 higher than the sensitivity of the second photoelectric conversion portion 122.
The cladding 112 is provided with wiring lines 125 that transmit driving signals for setting periods of exposure time of the respective photoelectric conversion portions 121 and 122 and read the signals acquired by the photoelectric conversion portions 121 and 122. The driving signals that are transmitted from the peripheral circuits 104 to the pixel 101 through the wiring lines 125 activate the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 on the basis of respective desired periods of exposure time.
To summarize, in the pixel 101 included in the solid-state image sensor 100 according to the general embodiment of the present invention, light that is incident on the pixel 101 is guided to each of the photoelectric conversion portions 121 and 122 by the waveguide 110 provided in such a manner as to be shifted with respect to the center 124 of the barrier region 123 at the surface of the substrate 120. Employing such a configuration makes the angular dependence of the sensitivity of each of a plurality of photoelectric conversion portions 121 and 122, which have different levels of sensitivity, lower than that observed in the related-art solid-state image sensor in which light that is incident on the pixel is guided to each of the photoelectric conversion portions by using a microlens. Comparison with the case of the related-art solid-state image sensor will be given below.
Angular Dependence of Related-Art Solid-State Image Sensor
As can be seen from
Position of Waveguide and Angular Dependence
Now, a case where the microlens, the cause of the angular dependence of the pixel, is omitted will be discussed. The microlens provided on the light-entrance side of the pixel condenses light incident on the pixel from the outside and guides the light to the first photoelectric conversion portion and to the second photoelectric conversion portion provided in the pixel. Hence, if the microlens is simply omitted, a ray of light that is obliquely incident on the pixel, in particular, travels straight without being condensed, and a portion thereof enters an adjacent pixel. Such a situation increases so-called crosstalk between pixels. Consequently, image quality may be deteriorated.
In contrast, the pixel 101 of the solid-state image sensor 100 according to the general embodiment of the present invention includes the waveguide 110 whose center 113 of the exit face 111E of the core 111 is shifted toward the first-photoelectric-conversion-portion side with respect to the center 124 of the barrier region 123 at the surface of the substrate 120, whereby light is guided to the photoelectric conversion portions 121 and 122, instead of condensing the light by using the microlens. The light that enters the waveguide 110 is emitted after being coupled with a plurality of waveguide modes. Therefore, the intensity distribution of the light at the exit face 111E of the waveguide 110 is more even than that of the light condensed by the microlens. Consequently, the angular dependence of the sensitivity of each of the photoelectric conversion portions 121 and 122 is reduced. Furthermore, a ray of light that is obliquely incident on the pixel 101 is efficiently guided to the photoelectric conversion portions 121 and 122 by the waveguide 110. Therefore, the crosstalk between pixels 101 is also reduced.
As is obvious from the comparison between
Amount of Shift of Waveguide from Viewpoint of Sensitivity Ratio
The amount of shift of the center 113 of the exit face 111E of the core 111 with respect to the center 124 of the barrier region 123 may be changed in accordance with the required sensitivity ratio between the photoelectric conversion portions 121 and 122. To make the sensitivity of the first photoelectric conversion portion 121 satisfactorily higher than the sensitivity of the second photoelectric conversion portion 122, a second-photoelectric-conversion-portion-side end 114 of the exit face 111E of the core 111 may be positioned on the first-photoelectric-conversion-portion side with respect to a center 126 of the second photoelectric conversion portion 122 as illustrated in
Amount of Shift of Waveguide from Viewpoint of Angular Dependence of Sensitivity
As illustrated in
Moreover, the second-photoelectric-conversion-portion-side end 114 of the exit face 111E of the core 111 may be positioned on the second-photoelectric-conversion-portion side (+X side) with respect to a boundary between the barrier region 123 and the first photoelectric conversion portion 121. If the second-photoelectric-conversion-portion-side end 114 of the exit face 111E of the core 111 is positioned on the −X side with respect to the boundary between the barrier region 123 and the first photoelectric conversion portion 121, substantially all rays of light emitted from the waveguide 110 enter the first photoelectric conversion portion 121. Accordingly, the sensitivity of the second photoelectric conversion portion 122 becomes too low. Consequently, the quality of an image acquired by the second photoelectric conversion portion 122 may be deteriorated.
That is, the best position of the second-photoelectric-conversion-portion-side end 114 of the exit face 111E of the core 111 is a position on the second-photoelectric-conversion-portion side with respect to the boundary between the barrier region 123 and the first photoelectric conversion portion 121 and on the first-photoelectric-conversion-portion side with respect to the boundary 127 between the barrier region 123 and the second photoelectric conversion portion 122.
As can be seen from
The reason for the lower angular dependence of the sensitivity of each of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 in the arrangement illustrated in
That is, the angular dependence of the sensitivity of each of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 can further be made lower in the case where the end 114 is positioned on the −X side with respect to the boundary 127 than in the case where the end 114 is positioned on the +X side with respect to the boundary 127.
As can be seen from the comparison between the case illustrated in
Supplementary Explanation
The areas of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 may be different as illustrated in
If the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 are arranged side by side in the XY plane and in a direction that is at an angle β (>0°) with respect to the X axis, the center 113 of the exit face 111E of the core 111 is shifted along the surface of the substrate 120 with respect to the center 124 of the barrier region 123 and in a direction that is at the angle β with respect to the X axis.
Protective Layer and Angular Dependence
Now, the thickness of the protective layer 116 will be described.
The optical distance L increases in order of
That is, the distance between the exit face 111E of the core 111 and a light-entrance-side surface of the substrate 120 may be short. Specifically, the optical distance L between the exit face 111E of the core 111 and the surface of the substrate 120 may be set to twice the wavelength of light that can be sensed by the photoelectric conversion portions 121 and 122, or shorter. Herein, light that can be sensed by the photoelectric conversion portions 121 and 122 refers to light of which 5% or more of the quantity incident on the pixel 101 is absorbed by the photoelectric conversion portions 121 and 122 in combination.
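Since the protective layer may comprise a plurality of stacked layers, the optical distance L is the sum of refractive index times physical thickness over the layers. Below is a minimal sketch of this criterion; the two-layer stack, its indices, and its thicknesses are hypothetical (loosely modeled on a silicon-nitride-like layer over a silicon-oxide-like layer):

```python
def optical_distance(layers):
    """Optical distance through a stack: the sum of (refractive index x
    physical thickness) over the layers between the core exit face and
    the substrate surface. Thicknesses in meters."""
    return sum(n * t for n, t in layers)

def within_guideline(layers, wavelength):
    """Check the criterion that L be at most twice the sensed wavelength."""
    return optical_distance(layers) <= 2.0 * wavelength

# Hypothetical stack: 100 nm with n = 2.0 over 200 nm with n = 1.46.
stack = [(2.0, 100e-9), (1.46, 200e-9)]
print(within_guideline(stack, wavelength=550e-9))  # True: L = 492 nm <= 1100 nm
```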
The reason for the above is as follows. As described above, a ray that is incident on the pixel 101 at an angle within the angle range 140 is selectively coupled with a plurality of specific waveguide modes 142. Then, the ray exits from the exit face 111E of the core 111 and is coupled with a specific propagation mode in the protective layer 116. Hence, if the distance between the exit face 111E of the core 111 and the surface of the substrate 120 is long, the ray exited from the exit face 111E of the core 111 is propagated within the protective layer 116 before reaching the surface of the substrate 120. A portion of the ray propagated in such a manner enters the second photoelectric conversion portion 122 and is converted into electric charge. Most of such electric charge is accumulated in the second photoelectric conversion portion 122. Therefore, the sensitivity of the second photoelectric conversion portion 122 for the ray that is incident on the pixel 101 at a specific angle is increased, resulting in angular dependence of sensitivity.
Comparing
The distribution of refractive index of the protective layer 116 in a region directly below the exit face 111E of the core 111 may be even in the in-plane direction of the substrate 120 (in the XY plane). If the refractive index of the protective layer 116 varies in the in-plane direction, the propagation mode of the ray having exited from the waveguide 110 with a nearly even distribution is changed by the refractive-index variation in the in-plane direction before the ray reaches the surface of the substrate 120. Consequently, the angular dependence of the sensitivity of each of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 is increased. The state where the distribution of refractive index is even refers to a state where the standard deviation of the distribution of refractive index is 0.1 or smaller.
The distribution of refractive index of the protective layer 116 is obtained by measuring the refractive index of the protective layer 116 in the region directly below the exit face 111E of the core 111 at five or more points on a line connecting the respective centroids of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122, including a point on the barrier region 123 and a point on the first photoelectric conversion portion 121. The points of measurement are all at the same distance from the surface of the substrate 120 in the thickness direction. Specifically, if the protective layer 116 includes a plurality of layers, the refractive index is measured within one layer.
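The evenness criterion can be evaluated directly from such measurements. A minimal sketch follows, assuming the population standard deviation (the text does not specify population versus sample); the function name and measured values are hypothetical:

```python
import statistics

def index_distribution_is_even(indices):
    """Evenness criterion used here: the standard deviation of refractive
    indices measured at five or more points (all at the same depth, on the
    line connecting the two portions' centroids) is 0.1 or smaller."""
    if len(indices) < 5:
        raise ValueError("measure at five or more points")
    return statistics.pstdev(indices) <= 0.1

# Hypothetical measurements across the region directly below the exit face.
print(index_distribution_is_even([1.46, 1.47, 1.46, 1.45, 1.46]))  # True
```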
The refractive index is measurable with an interferometer, an ellipsometer, or the like. Alternatively, the composition of the material for the protective layer 116 may first be analyzed by Fourier-transform infrared spectroscopy (FTIR), X-ray diffractometry (XRD), mass spectrometry, or the like, and the result may be converted into refractive index.
Shape and Position of Waveguide
The shape of the waveguide 110 according to the general embodiment of the present invention is not limited to the shape illustrated in
Furthermore, “the center 115 of the entrance face of the core 111” refers to the centroid of a plan-view shape, seen in the Z direction, of the entrance face of the core 111, as with “the center 113 of the exit face 111E of the core 111.”
The structure illustrated in
The plan-view shape of the core 111 is not limited to the circular shape as illustrated in
Microlens
A microlens may be provided on the light-entrance side with respect to the waveguide 110. If a microlens is provided, light that is incident on the pixel 101 can be efficiently guided to the waveguide 110. Note that, if a microlens is provided, the angular dependence of the sensitivity of a region between the photoelectric conversion portions 121 and 122 is increased. Therefore, from the viewpoint of reducing the angular dependence, the microlens is not necessary. However, even if a microlens is provided, the angular dependence of each of the photoelectric conversion portions 121 and 122 is lower than that observed in the related-art solid-state image sensor, because of the following reason.
If no waveguide is provided between the microlens and the photoelectric conversion portions as in the related-art solid-state image sensor, the focusing effect of the microlens directly affects the angular dependence of the sensitivity of the photoelectric conversion portions. In contrast, if a waveguide is provided between the microlens and the photoelectric conversion portions, light transmitted through the microlens is incident on the entrance face of the waveguide. The light thus incident on the entrance face of the waveguide is propagated in the waveguide while being coupled with a plurality of waveguide modes, and is emitted toward the photoelectric conversion portions. Therefore, the distribution of light intensity is more even at the exit face of the waveguide than at the entrance face of the waveguide. That is, the distribution of light intensity can be made more even than in the case of the related-art solid-state image sensor in which light that is incident on the pixel is condensed on the photoelectric conversion portions while being transmitted only through the microlens. Hence, according to the general embodiment of the present invention, even if a microlens is provided on the light-entrance side with respect to the waveguide 110, the angular dependence of the photoelectric conversion portions 121 and 122 can be made lower than that observed in the related-art solid-state image sensor.
Microlens Having Different Levels of Refractive Power in Different Directions
As illustrated in
If the refractive power of the microlens 117 in a first direction (along line A-A) in which the line connecting the center of the first photoelectric conversion portion 121 and the center of the second photoelectric conversion portion 122 to each other extends is high, the distribution of light intensity in the first direction at the entrance face of the waveguide 110 greatly depends on the angle of incidence of the light on the pixel 101 in the first direction. As the first-direction distribution of light intensity at the entrance face of the waveguide 110 becomes more uneven, the first-direction distribution of intensity of the light exiting from the waveguide 110 becomes more uneven. Accordingly, the ratio between the quantities of light that enters the respective photoelectric conversion portions 121 and 122 changes. Thus, if the refractive power of the microlens in the first direction is high, the angular dependence of each of the photoelectric conversion portions 121 and 122 in the first direction is high.
In contrast, the refractive power of the microlens 117 in a second direction (along line B-B) perpendicular to the first direction at the surface of the substrate 120 (in the XY plane) only affects the second-direction distribution of intensity of the light exiting from the waveguide 110. The first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 are provided side by side in the first direction. Therefore, even if the intensity distribution is uneven in the second direction, the change in the ratio between the quantities of light that enters the respective photoelectric conversion portions 121 and 122 is small.
In view of the above, the refractive power of the microlens 117 in the X direction, which greatly affects the angular dependence of sensitivity, may be reduced, whereas the refractive power of the microlens 117 in the Y direction, which little affects the angular dependence of sensitivity, may be increased in consideration of the sensitivity of the photoelectric conversion portions 121 and 122. That is, in the XY plane, the microlens 117 may have a lower refractive power in the first direction, in which the line connecting the center of the first photoelectric conversion portion 121 and the center of the second photoelectric conversion portion 122 extends, than in the second direction that is perpendicular to the first direction. In particular, if the focal point of the microlens 117 in the second direction is set near the entrance face of the core 111, the sensitivity of the photoelectric conversion portions 121 and 122 is maximized.
In each of the cases illustrated in
The above description all concerns a front-side-illuminated solid-state image sensor in which the wiring lines 125 are provided on the same side of the substrate 120 as the waveguide 110. Alternatively, the solid-state image sensor may be of a back-side-illuminated type in which the wiring lines 125 are provided on the other side of the substrate 120 across from the waveguide 110. If the present invention is applied to a back-side-illuminated solid-state image sensor, the layout of the waveguide 110 and the layout of the wiring lines 125 can be determined independently of each other. Therefore, the manufacturing process is simplified. Particularly, if a part of the waveguide 110 extends over an adjacent pixel 101 as illustrated in
Arrangement of Pixels in Pixel Area
In the case where a plurality of pixels 101 are provided in the pixel area 103 of the solid-state image sensor 100, the arrangement of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 in a single pixel 101 may be the same for all of the plurality of pixels 101 or different between different pixels 101.
Note that, if a part of the waveguide 110 extends over an adjacent pixel 101 as illustrated in
The direction from the center of the first photoelectric conversion portion 121 toward the center of the second photoelectric conversion portion 122 in a single pixel 101 may be different between different pixels 101. That is, the direction may be the X direction in some pixels 101, the Y direction in other pixels 101, and a direction that is oblique with respect to the X direction in yet other pixels 101.
A first embodiment of the present invention will now be described with reference to
In the first embodiment, a high-sensitivity signal is acquired by the first photoelectric conversion portion 121, which receives a larger portion of the light, and a low-sensitivity signal is acquired by the second photoelectric conversion portion 122, which receives a smaller portion of the light. The photoelectric conversion portions 121 and 122 are driven by the respective driving signals transmitted from the peripheral circuits 104 through the wiring lines 125, in such a manner as to be exposed to light for the same period of time.
The signals acquired by the respective photoelectric conversion portions 121 and 122 are transferred to the peripheral circuits 104 through the wiring lines 125 and are output from the peripheral circuits 104 to an external device. The signals acquired by the photoelectric conversion portions 121 and 122 may be output from the peripheral circuits 104 as they are. Alternatively, a high-sensitivity signal may be output if the quantity of light that is incident on the pixel 101 is smaller than a threshold, and a low-sensitivity signal may be output if the quantity of light that is incident on the pixel 101 is larger than or equal to the threshold. The threshold is set to a value smaller than a value corresponding to the signal intensity at which the high-sensitivity signal is saturated and larger than a value corresponding to the signal intensity at which the low-sensitivity signal exhibits a desired signal-to-noise (SN) ratio.
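The threshold-based output selection described above can be sketched as follows; the function name, signal values, and threshold are hypothetical:

```python
def select_signal(high_signal, low_signal, light_quantity, threshold):
    """Selection rule described above: output the high-sensitivity signal
    while the quantity of light incident on the pixel is below the
    threshold, and the low-sensitivity signal otherwise."""
    return high_signal if light_quantity < threshold else low_signal

# The threshold lies above the level where the low-sensitivity signal
# reaches the desired SN ratio and below the level where the
# high-sensitivity signal saturates, so the switch occurs while both
# signals are usable.
print(select_signal(0.8, 0.2, light_quantity=10.0, threshold=50.0))  # 0.8
print(select_signal(1.0, 0.3, light_quantity=80.0, threshold=50.0))  # 0.3
```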
Now, a method of widening the dynamic range by using the high-sensitivity signal and the low-sensitivity signal will be described.
The first threshold 1051 is set to a signal intensity lower than the signal intensity at which the high-sensitivity signal 1031 is saturated. The second threshold 1052 is set to a signal intensity at which the SN ratio of the low-sensitivity signal 1032 exceeds a desired value. Hence, a quantity 1062 of light incident on the pixel 101 when the intensity of the low-sensitivity signal 1032 is equal to the second threshold 1052 needs to be smaller than a quantity 1061 of light incident on the pixel 101 when the intensity of the high-sensitivity signal 1031 is equal to the first threshold 1051.
Here, let us consider the case of the related-art solid-state image sensor illustrated in
In such a case, the quantity 1062 of light incident on the pixel 101 when the intensity of the low-sensitivity signal 1032 is equal to the second threshold 1052 is larger than the quantity 1061 of light incident on the pixel 101 when the intensity of the high-sensitivity signal 1031 is equal to the first threshold 1051. Therefore, if the quantity of light incident on the pixel 101 falls within a range 1063, the high-sensitivity signal 1031 is saturated while the low-sensitivity signal 1032 has an insufficient SN ratio. Hence, the quality of a composed image may deteriorate in the range 1063, which corresponds to the point of switching between the high-sensitivity signal 1031 and the low-sensitivity signal 1032. A signal saturates when the quantity of light incident on the photoelectric conversion portion exceeds the level at which the amount of charge accumulated in the photoelectric conversion portion reaches its maximum.
Now, let us consider a case where the sensitivity ratio between the first photoelectric conversion portion 1021 and the second photoelectric conversion portion 1022 is low.
As described above, if the angular dependence of the sensitivity ratio between the photoelectric conversion portions 1021 and 1022 is high, the quality of a resulting image may be deteriorated at the point of switching between the two signals or the dynamic range may be narrowed, depending on the states of the lens and the aperture that are used for applying light to the solid-state image sensor. In contrast, in the solid-state image sensor 100 according to the first embodiment of the present invention, the angular dependence of the sensitivity of each of the photoelectric conversion portions 121 and 122 having different levels of sensitivity is low, as typically graphed in
The required sensitivity ratio between the photoelectric conversion portions 121 and 122 is determined on the basis of the dynamic range or the SN ratio that is required for an image to be composed. In the first embodiment, the amount of shift of the center 113 of the exit face 111E of the core 111 with respect to the center 124 of the barrier region 123 may be determined on the basis of the dynamic range or the SN ratio that is required for an image to be composed.
To widen the dynamic range by combining the two signals, the following relationship needs to be satisfied:
C2/S2>C1/S1 (Expression 1)
where S1 and C1 denote the sensitivity and the capacity, respectively, of the first photoelectric conversion portion 121 that acquires the high-sensitivity signal, and S2 and C2 denote the sensitivity and the capacity, respectively, of the second photoelectric conversion portion 122 that acquires the low-sensitivity signal.
The reason for this is as follows. C1/S1 and C2/S2 correspond to the maximum quantities of light that can be accumulated as charge in the photoelectric conversion portions 121 and 122, respectively. If C2/S2≤C1/S1, the second photoelectric conversion portion 122 that acquires the low-sensitivity signal is saturated with a quantity of light that is smaller than or equal to the quantity of light with which the first photoelectric conversion portion 121 that acquires the high-sensitivity signal is saturated, and the low-sensitivity signal therefore cannot extend the dynamic range. Moreover, C2/S2 may be at least twice C1/S1.
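Expressed numerically, the quantity of light at which each photoelectric conversion portion saturates is its capacity divided by its sensitivity. The following sketch (hypothetical names, unitless illustrative values) checks that the low-sensitivity portion keeps responding beyond the point where the high-sensitivity portion saturates:

```python
def saturation_light(capacity, sensitivity):
    """Light quantity at which accumulated charge (sensitivity x light)
    reaches the portion's capacity."""
    return capacity / sensitivity

def widens_dynamic_range(s1, c1, s2, c2):
    """True when the low-sensitivity portion (s2, c2) saturates at a
    larger light quantity than the high-sensitivity portion (s1, c1)."""
    return saturation_light(c2, s2) > saturation_light(c1, s1)
```

For instance, with S1=10, C1=100, S2=1, C2=100, the high-sensitivity portion saturates at 10 units of light and the low-sensitivity portion at 100, so the combination widens the dynamic range.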
The ratio between the sensitivity S1 of the first photoelectric conversion portion 121 and the sensitivity S2 of the second photoelectric conversion portion 122 is changeable with the amount of shift of the center 113 of the exit face 111E of the core 111 with respect to the center 124 of the barrier region 123, as described above.
To increase the capacity C1 or C2 of the photoelectric conversion portion 121 or 122, the volume of the photoelectric conversion portion 121 or 122 or the concentration of a dopant for forming the photoelectric conversion portion 121 or 122 may be increased. To increase the volume of the photoelectric conversion portion 121 or 122, the opening of the photoelectric conversion portion 121 or 122 may be widened by increasing the area of ion implantation, or the depth of the photoelectric conversion portion 121 or 122 may be increased by implanting ions deeply into the substrate 120. Note that, if the photoelectric conversion portions 121 and 122 have the same depth and the same dopant concentration, the photoelectric conversion portions 121 and 122 can be formed under the same ion-implantation conditions, whereby the manufacturing process is simplified.
Even if the exposure time for the first photoelectric conversion portion 121 and the exposure time for the second photoelectric conversion portion 122 are not the same, an image with a wide dynamic range can be acquired, as long as the amounts of charge that can be accumulated in the respective photoelectric conversion portions 121 and 122 are different. However, if the two exposure times are different, the photoelectric conversion portions 121 and 122 capture different amounts of motion blur of the object, and an image composed from such signals may appear unnatural. Therefore, the exposure time for the first photoelectric conversion portion 121 and the exposure time for the second photoelectric conversion portion 122 are desirably the same.
The pixel 101 may include three or more photoelectric conversion portions having different levels of sensitivity. If pixel signals acquired by three or more photoelectric conversion portions having different levels of sensitivity are combined, the dynamic range of an image to be composed can be widened further.
A second embodiment of the present invention will now be described. In the second embodiment, the solid-state image sensor according to the present invention is used such that a plurality of photoelectric conversion portions having different levels of sensitivity are driven for different periods of exposure time, whereby an image taken with low sensitivity and a long exposure time and an image taken with high sensitivity and a short exposure time are acquired simultaneously.
In general, the exposure time required for taking a smooth moving image is often longer than the exposure time required for taking a still image. Hereinafter, an image taken with low sensitivity and a long exposure time is regarded as a moving image, and an image taken with high sensitivity and a short exposure time is regarded as a still image. If the exposure time for the still image is longer than the exposure time for the moving image, the photoelectric conversion portions provided for acquiring the moving image and the still image, respectively, only need to be interchanged with each other.
The second embodiment is different from the first embodiment in that the exposure time for the first photoelectric conversion portion 121 is shorter than the exposure time for the second photoelectric conversion portion 122. The first photoelectric conversion portion 121 acquires a still-image signal, and the second photoelectric conversion portion 122 acquires a moving-image signal. The signals thus acquired are output from the peripheral circuits 104 and are used for forming a still image and a moving image, respectively.
Let us consider a case of the related-art solid-state image sensor illustrated in
For example, if the lens used has a large f-number or the size of the aperture is reduced, the incident angle of the light that enters the pixel 1001 is within the angle range 1041 indicated in
For example, if the lens used has a small f-number or the aperture is fully opened, the incident angle of the light that enters the pixel 1001 is within the angle range 1040 indicated in
As described above, if the angular dependence of the sensitivity of each of the photoelectric conversion portions is high, the quality of a resulting still image and a resulting moving image may be deteriorated, depending on the states of the lens and the aperture that are used.
In contrast, in the solid-state image sensor 100 according to the second embodiment of the present invention, the angular dependence of the sensitivity of each of the photoelectric conversion portions 121 and 122 having different levels of sensitivity is low, as typically graphed in
With the solid-state image sensor 100 according to the second embodiment, different images, i.e., a still image and a moving image, can be formed from the still-image signal acquired by the first photoelectric conversion portion 121 and the moving-image signal acquired by the second photoelectric conversion portion 122, respectively. Therefore, the still-image signal and the moving-image signal are desirably generated with intensities and dynamic ranges that are as close to each other as possible.
Hence, the following relationships may also be satisfied:
S1×T1=S2×T2 (Expression 2)
C1/(S1×T1)=C2/(S2×T2) (Expression 3)
where S1, C1, and T1 denote the sensitivity, the capacity, and the exposure time, respectively, of the first photoelectric conversion portion 121 that acquires the still-image signal, and S2, C2, and T2 denote the sensitivity, the capacity, and the exposure time, respectively, of the second photoelectric conversion portion 122 that acquires the moving-image signal.
Expression 2 defines a condition regarding the signal intensity. Expression 3 defines a condition regarding the dynamic range.
As described above, the ratio between S1 and S2 is controllable by changing the amount of shift of the waveguide 110. As can be seen from Expression 2, in the second embodiment, the ratio between S1 and S2 can be determined by estimating the exposure time for each of a still image and a moving image to be used. For example, if the exposure time for the moving image is 1/60 seconds and the exposure time for the still image is 1/600 seconds, the pixel 101 is configured such that S1 is ten times S2.
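Rearranging Expression 2 gives S1/S2 = T2/T1, so the required sensitivity ratio follows directly from the two exposure times. A worked check of the 1/60 s and 1/600 s example (function name is illustrative):

```python
def required_sensitivity_ratio(t_still, t_movie):
    """S1/S2 = T2/T1, from S1*T1 = S2*T2 (Expression 2)."""
    return t_movie / t_still

# Movie exposure 1/60 s, still exposure 1/600 s:
# the still-image portion must be ten times as sensitive.
```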
To satisfy Expressions 2 and 3 simultaneously, the capacity C1 of the first photoelectric conversion portion 121 and the capacity C2 of the second photoelectric conversion portion 122 may be the same. Herein, “to be the same” implies that errors due to tolerances in the manufacturing process are permissible. Specifically, if the difference between the capacity C1 of the first photoelectric conversion portion 121 and the capacity C2 of the second photoelectric conversion portion 122 is smaller than 10% of the capacity C1 of the first photoelectric conversion portion 121, the capacities C1 and C2 are regarded as being the same.
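The 10% tolerance criterion stated above can be written as a simple predicate (the function name is hypothetical):

```python
def capacities_regarded_as_same(c1, c2, tolerance=0.10):
    """C1 and C2 count as 'the same' when their difference is smaller
    than 10% of C1, allowing for manufacturing tolerances."""
    return abs(c1 - c2) < tolerance * c1
```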
As described above, the capacity of the photoelectric conversion portion is determined by the volume of the photoelectric conversion portion and the concentration of the dopant for forming the photoelectric conversion portion.
In the case illustrated in
The pixel 101 may include three or more photoelectric conversion portions. If the exposure time is varied among the three or more photoelectric conversion portions and three or more images based on the respectively different periods of exposure time are acquired simultaneously, a plurality of images having different levels of blur can be acquired. Moreover, photoelectric conversion portions to be used may be selected in accordance with the shutter speed that is set. According to Expressions 2 and 3, suitable characteristics of the photoelectric conversion portions vary with the set shutter speed. Hence, if two of the three or more photoelectric conversion portions are selected, a still-image signal and a moving-image signal with close levels of intensity and close dynamic ranges can be acquired simultaneously.
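Selecting two of three or more photoelectric conversion portions to suit a set shutter speed might look like the following sketch, where each portion is modeled as a (sensitivity, capacity) pair and the target ratio comes from Expression 2; this is an assumed selection strategy, not one prescribed by the embodiment:

```python
from itertools import permutations

def select_pair(portions, t_still, t_movie):
    """Pick the ordered (still, movie) pair of portions whose
    sensitivity ratio is closest to T2/T1 (Expression 2).
    portions: list of (sensitivity, capacity) tuples."""
    target = t_movie / t_still
    return min(permutations(portions, 2),
               key=lambda pair: abs(pair[0][0] / pair[1][0] - target))
```

For example, among portions with sensitivities 10, 5, and 1, a 1/600 s still exposure paired with a 1/60 s movie exposure (target ratio 10) selects the portions with sensitivities 10 and 1.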
Signals acquired by a plurality of photoelectric conversion portions exposed to light for the same period of time may be used for the acquisition of an image with a wide dynamic range, and signals acquired by a plurality of photoelectric conversion portions exposed to light for different periods of time may be used for the simultaneous acquisition of a still image and a moving image.
The CPU 192 is a circuit that controls the transfer circuit 193, the signal processing unit 194, and the device driving circuit 195. The device driving circuit 195 is a circuit that drives the solid-state image sensor 100 in accordance with the signal from the CPU 192 and controls, for example, the periods of exposure time for the respective photoelectric conversion portions 121 and 122 provided in each of the pixels 101, and the timings of reading the signals acquired by the photoelectric conversion portions 121 and 122. The transfer circuit 193 stores the signals read from the solid-state image sensor 100 and transfers the signals to the signal processing unit 194. The signal processing unit 194 processes the signals acquired through the transfer circuit 193 into an image.
The image pickup apparatus 190 is selectively operable in a dynamic-range-widening mode in which the solid-state image sensor 100 is driven in accordance with the first embodiment or in a moving-image-and-still-image-simultaneous-acquisition mode in which the solid-state image sensor 100 is driven in accordance with the second embodiment. The mode is selectable by a user through an operation unit (not illustrated). The CPU 192 controls the associated circuits in accordance with the mode selected.
If the dynamic-range-widening mode is selected, the solid-state image sensor 100 is activated such that the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 are exposed to light for the same period of time and such that the first photoelectric conversion portion 121 having higher sensitivity acquires a high-sensitivity signal and the second photoelectric conversion portion 122 having lower sensitivity acquires a low-sensitivity signal. If the quantity of light that is incident on the pixel 101 is lower than the threshold, the high-sensitivity signal is used. If the quantity of light that is incident on the pixel 101 is higher than or equal to the threshold, the low-sensitivity signal is used. With a combination of the two signals, an image with a wide dynamic range is formed.
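A per-pixel sketch of the combination performed in the dynamic-range-widening mode; scaling the low-sensitivity signal by the sensitivity ratio so the two ranges line up is an assumption about the composition step, and all names are illustrative:

```python
def compose_wide_dynamic_range(high, low, threshold, sensitivity_ratio):
    """For each pixel, keep the high-sensitivity value while it is
    below the threshold; otherwise substitute the low-sensitivity
    value, scaled onto the high-sensitivity scale."""
    return [h if h < threshold else l * sensitivity_ratio
            for h, l in zip(high, low)]
```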
If the moving-image-and-still-image-simultaneous-acquisition mode is selected, the solid-state image sensor 100 is activated such that the exposure time for the first photoelectric conversion portion 121 is shorter than the exposure time for the second photoelectric conversion portion 122. If the exposure time is set to a shorter value for the still image than for the moving image for the purpose of, for example, shooting an object that is moving fast, a still-image signal is acquired by the first photoelectric conversion portion 121 having higher sensitivity while a moving-image signal is acquired by the second photoelectric conversion portion 122 having lower sensitivity. Conversely, if the exposure time is set to a longer value for the still image than for the moving image for the purpose of, for example, intentionally adding motion blur, a moving-image signal is acquired by the first photoelectric conversion portion 121 having higher sensitivity while a still-image signal is acquired by the second photoelectric conversion portion 122 having lower sensitivity. The exposure time for the still image is determined by the user. The exposure time for the moving image is set to approximately the reciprocal of the frame rate of the solid-state image sensor 100. Thus, a still image and a moving image can be formed simultaneously from the still-image signal and the moving-image signal acquired in the above manner.
The solid-state image sensor 100 of the image pickup apparatus 190 is not limited to operate on the basis of only one of the first embodiment and the second embodiment. For example, the solid-state image sensor 100 may have both the mode for acquiring an image with a wide dynamic range and the mode for simultaneously acquiring a moving image and a still image so that the mode is switched between the two in accordance with the image to be acquired. In such a case, the levels of sensitivity and the capacities of the first photoelectric conversion portion 121 and the second photoelectric conversion portion 122 need to satisfy at least Expression 1. In addition, Expressions 2 and 3 may also be satisfied.
If both Expressions 2 and 3 are satisfied, the capacity of the first photoelectric conversion portion 121 and the capacity of the second photoelectric conversion portion 122 are the same. Therefore, Expression 1 is naturally satisfied. Hence, in the solid-state image sensor 100 having both the mode for acquiring an image with a wide dynamic range and the mode for simultaneously acquiring a moving image and a still image, the capacity of the first photoelectric conversion portion 121 and the capacity of the second photoelectric conversion portion 122 may be the same.
To summarize, the image pickup apparatus 190 according to the third embodiment is capable of acquiring an excellent image with a wide dynamic range and is also capable of simultaneously acquiring a moving image and a still image.
According to any of the embodiments of the present invention, a solid-state image sensor is provided in which the angular dependence of the sensitivity of each of a plurality of photoelectric conversion portions provided in each pixel and having different levels of sensitivity is reduced, and the deterioration of image quality that may occur depending on the states of a camera lens and an aperture that are used is suppressed.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2015-024485, filed Feb. 10, 2015, which is hereby incorporated by reference herein in its entirety.
Assignment: On Jan. 28, 2016, Aihiko Numata assigned the invention to Canon Kabushiki Kaisha (assignment of assignor's interest; Reel/Frame 038647/0293). The application was filed on Feb. 4, 2016, by Canon Kabushiki Kaisha.
Maintenance history: the 4th-year maintenance fee was paid on Feb. 27, 2020; a maintenance-fee reminder was mailed on May 6, 2024; and the patent expired on Oct. 21, 2024, for failure to pay maintenance fees.