An image sensor may include an array of imaging pixels. Each imaging pixel may have a photosensitive area that is covered by a respective multipart diffractive lens to focus light onto the photosensitive area. The multipart diffractive lenses may have multiple portions with different indices of refraction. The portions of the diffractive lenses closer to the center of the diffractive lenses may have higher indices of refraction to focus light. Alternatively, the portions of the diffractive lenses closer to the center of the diffractive lenses may have lower indices of refraction to defocus light. The multipart diffractive lenses may have stacked layers with the same refractive indices but different widths.
1. An image sensor comprising a plurality of imaging pixels, wherein each imaging pixel of the plurality of imaging pixels comprises:
a photodiode; and
a diffractive lens formed over the photodiode,
wherein the diffractive lens has an edge portion with a first refractive index and a center portion with a second refractive index that is different than the first refractive index and wherein the edge portion is adjacent a solid material with a third refractive index that is different than the first and second refractive indices.
5. An image sensor comprising a plurality of imaging pixels, wherein each imaging pixel of the plurality of imaging pixels comprises:
a photodiode; and
a diffractive lens formed over the photodiode, wherein the diffractive lens has a first portion with a first refractive index, a second portion with a second refractive index, and a third portion with a third refractive index, wherein the second portion is interposed between the first portion and the third portion and wherein the second refractive index is less than the first refractive index and greater than the third refractive index.
9. An image sensor comprising a plurality of imaging pixels, wherein each imaging pixel of the plurality of imaging pixels comprises:
a photosensitive area;
a color filter element formed over the photosensitive area; and
a multipart diffractive lens formed over the color filter element that focuses incident light on the photosensitive area, wherein the multipart diffractive lens comprises a first layer and a second layer formed over the first layer, wherein the first layer has a first width, wherein the second layer has a second width that is greater than the first width, and wherein the first and second layers have the same refractive index.
2. The image sensor defined in
3. The image sensor defined in
4. The image sensor defined in
6. The image sensor defined in
7. The image sensor defined in
8. The image sensor defined in
10. The image sensor defined in
11. The image sensor defined in
12. The image sensor defined in
13. The image sensor defined in
This relates generally to image sensors and, more particularly, to image sensors having lenses to focus light.
Image sensors are commonly used in electronic devices such as cellular telephones, cameras, and computers to capture images. In a typical arrangement, an electronic device is provided with an array of image pixels arranged in pixel rows and pixel columns. Each image pixel in the array includes a photodiode that is coupled to a floating diffusion region via a transfer gate. Each pixel receives incident photons (light) and converts the photons into electrical signals. Column circuitry is coupled to each pixel column for reading out pixel signals from the image pixels. Image sensors are sometimes designed to provide images to electronic devices using a Joint Photographic Experts Group (JPEG) format.
Conventional image sensors sometimes include a color filter element and a microlens above each pixel. The microlenses of conventional image sensors typically have curved surfaces and use refraction to focus light on an underlying photodiode. However, these types of microlenses may allow peripheral light to pass through the microlenses without being focused, leading to optical cross-talk.
It would therefore be desirable to provide improved lenses for image sensors.
Embodiments of the present invention relate to image sensors with pixels that include diffractive lenses. An electronic device with a digital camera module is shown in
Still and video image data from image sensor 16 may be provided to image processing and data formatting circuitry 14 via path 27. Image processing and data formatting circuitry 14 may be used to perform image processing functions such as automatic focusing functions, depth sensing, data formatting, adjusting white balance and exposure, implementing video image stabilization, face detection, etc. For example, during automatic focusing operations, image processing and data formatting circuitry 14 may process data gathered by phase detection pixels in image sensor 16 to determine the magnitude and direction of lens movement (e.g., movement of lens 29) needed to bring an object of interest into focus.
Image processing and data formatting circuitry 14 may also be used to compress raw camera image files if desired (e.g., to Joint Photographic Experts Group or JPEG format). In a typical arrangement, which is sometimes referred to as a system on chip (SOC) arrangement, camera sensor 16 and image processing and data formatting circuitry 14 are implemented on a common integrated circuit. The use of a single integrated circuit to implement camera sensor 16 and image processing and data formatting circuitry 14 can help to reduce costs. This is, however, merely illustrative. If desired, camera sensor 16 and image processing and data formatting circuitry 14 may be implemented using separate integrated circuits. If desired, camera sensor 16 and image processing circuitry 14 may be formed on separate semiconductor substrates. For example, camera sensor 16 and image processing circuitry 14 may be formed on separate substrates that have been stacked.
Camera module 12 may convey acquired image data to host subsystems 19 over path 18 (e.g., image processing and data formatting circuitry 14 may convey image data to subsystems 19). Electronic device 10 typically provides a user with numerous high-level functions. In a computer or advanced cellular telephone, for example, a user may be provided with the ability to run user applications. To implement these functions, host subsystem 19 of electronic device 10 may include storage and processing circuitry 17 and input-output devices 21 such as keypads, input-output ports, joysticks, and displays. Storage and processing circuitry 17 may include volatile and nonvolatile memory (e.g., random-access memory, flash memory, hard drives, solid state drives, etc.). Storage and processing circuitry 17 may also include microprocessors, microcontrollers, digital signal processors, application specific integrated circuits, or other processing circuits.
As shown in
Row control circuitry 26 may receive row addresses from control circuitry 24 and supply corresponding row control signals such as reset, row-select, charge transfer, dual conversion gain, and readout control signals to pixels 22 over row control paths 30. One or more conductive lines such as column lines 32 may be coupled to each column of pixels 22 in array 20. Column lines 32 may be used for reading out image signals from pixels 22 and for supplying bias signals (e.g., bias currents or bias voltages) to pixels 22. If desired, during pixel readout operations, a pixel row in array 20 may be selected using row control circuitry 26 and image signals generated by image pixels 22 in that pixel row can be read out along column lines 32.
Image readout circuitry 28 may receive image signals (e.g., analog pixel values generated by pixels 22) over column lines 32. Image readout circuitry 28 may include sample-and-hold circuitry for sampling and temporarily storing image signals read out from array 20, amplifier circuitry, analog-to-digital conversion (ADC) circuitry, bias circuitry, column memory, latch circuitry for selectively enabling or disabling the column circuitry, or other circuitry that is coupled to one or more columns of pixels in array 20 for operating pixels 22 and for reading out image signals from pixels 22. ADC circuitry in readout circuitry 28 may convert analog pixel values received from array 20 into corresponding digital pixel values (sometimes referred to as digital image data or digital pixel data). Image readout circuitry 28 may supply digital pixel data to control and processing circuitry 24 over path 25 for pixels in one or more pixel columns.
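As an illustrative sketch (not part of the claimed subject matter), the analog-to-digital conversion performed by the readout circuitry can be modeled as mapping a pixel voltage onto an n-bit digital code. The function name, reference voltage, and bit depth below are assumptions chosen for illustration only.

```python
def quantize(voltage, v_ref=1.0, bits=10):
    """Map an analog pixel voltage in [0, v_ref] to an n-bit digital code."""
    full_scale = (1 << bits) - 1  # largest representable code, e.g. 1023 for 10 bits
    code = int(voltage / v_ref * full_scale)
    return max(0, min(code, full_scale))  # clamp to the valid code range

# Example: a half-scale pixel value converted by a 10-bit ADC
print(quantize(0.5))  # 511
```

A real column ADC also involves sampling, amplification, and calibration steps that this sketch omits.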
Lens 42 may be transparent to incident light. Therefore, some light may pass through the lens without being focused. For example, incident light 46-1 may pass through the center of diffractive lens 42. The corresponding light 46-2 on the other side of the diffractive lens may travel in the same direction as incident light 46-1. In contrast, incident light at the edge of diffractive lens 42 may be redirected due to diffraction. For example, incident light 46-3 may pass by the edge of diffractive lens 42. The light may be redirected such that the output light 46-4 travels at an angle 48 relative to the incident light 46-3. In other words, the diffractive lens redirects the light at the edge of the lens using diffraction.
Diffraction occurs when a wave (such as light) encounters an obstacle. When light passes around the edge of an object, it is bent or redirected such that its direction changes relative to the original incident light. The amount and direction of bending depend on numerous factors. In an image sensor, diffraction of light can be used (with diffractive lenses) to redirect incident light in desired ways (e.g., focusing incident light on photodiodes to mitigate optical cross-talk).
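As a hedged illustration of the scale of this bending (using the textbook single-slit relation, not the specific lens geometry described here), the angle of the first diffraction minimum follows sin(θ) = λ/a for wavelength λ and feature size a:

```python
import math

def first_minimum_angle(wavelength_nm, aperture_nm):
    """First diffraction minimum for a single slit: sin(theta) = lambda / a.
    Returns the angle in degrees, or None if no minimum exists."""
    s = wavelength_nm / aperture_nm
    if s > 1:
        return None  # feature smaller than the wavelength: no first minimum
    return math.degrees(math.asin(s))

# Green light (550 nm) passing a 1.1 micron feature
print(round(first_minimum_angle(550, 1100), 1))  # 30.0
```

This shows why pixel-scale features (on the order of a micron) bend visible light by substantial angles, which is what makes diffractive lenses practical at image-sensor dimensions.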
In the example of
As shown in
Lens 50 may be transparent to incident light. Therefore, some light may pass through the lens without being focused. For example, incident light 46-1 may pass through the center of diffractive lens 50. The corresponding light 46-2 on the other side of the diffractive lens may travel in the same direction as incident light 46-1. In contrast, incident light at the edge of diffractive lens 50 may be redirected due to diffraction. For example, incident light 46-3 may pass by the edge of diffractive lens 50. The light may be redirected such that the output light 46-4 travels at an angle 54 relative to the incident light 46-3. In other words, the diffractive lens redirects the light at the edge of the lens using diffraction.
In addition to the refractive indices of the diffractive lens and the surrounding material, the thickness of the diffractive lens may also affect the response of the diffractive lens to incident light.
In particular, incident light 46-3 may pass by the edge of diffractive lens 42. The light may be redirected such that the output light 46-4 travels at an angle 48-1 relative to the incident light 46-3. This angle may be dependent upon the thickness 56 of diffractive lens 42. In the example of
In contrast, diffractive lens 42 in
Diffractive lenses 42 in
This shows how diffractive lenses may be used to redirect incident light in desired ways. The refractive indices of the lens and surrounding material may be altered to customize the response to incident light. Additionally, the thickness, length, and width of the diffractive lens may be altered to customize the response to incident light.
In
The aforementioned single-edge diffractive lenses may be effective at focusing or defocusing light at the edges of the diffractive lens. Light at the center of the diffractive lenses may pass through without being focused or defocused, as desired. However, light between the center and edges of the diffractive lenses also passes through the diffractive lens without being focused or defocused. This may not be desirable, as the performance of the lens may be improved if light between the center and edges of the diffractive lens were also focused or defocused.
To better focus light, a diffractive lens may therefore have two or more portions with different refractive indices. Examples of this type are shown in
As shown in
Lens 62 (i.e., both portions 64 and 66 of lens 62) may be transparent to incident light. Therefore, some light may pass through the lens without being focused. For example, incident light 46-1 may pass through the center of portion 66 of diffractive lens 62. The corresponding light 46-2 on the other side of the diffractive lens may travel in the same direction as incident light 46-1. In contrast, incident light at the edge of diffractive lens 62 may be redirected due to diffraction. For example, incident light 46-3 may pass by the edge of diffractive lens 62. The light may be redirected such that the output light 46-4 travels at an angle relative to the incident light 46-3. In other words, the diffractive lens redirects the light at the edge of the lens using diffraction. Additionally, due to the additional refractive index difference between portions 64 and 66 of the diffractive lens, light between the edge and center of the diffractive lens may also be redirected. For example, incident light 46-5 may pass by the interface of portions 64 and 66 of diffractive lens 62. The light may be redirected such that the output light 46-6 travels at an angle relative to the incident light 46-5.
The difference in refractive index between each material may be any desired refractive index difference (e.g., greater than 0.2, greater than 0.3, greater than 0.4, greater than 0.5, greater than 0.8, greater than 1.0, between 0.2 and 0.5, between 0.2 and 0.8, between 0.2 and 1.0, less than 1.0, less than 0.5, less than 0.3, etc.).
The example of the diffractive lens having two portions in
In the example of
As shown in
Lens 72 (i.e., both portions 74 and 76 of lens 72) may be transparent to incident light. Therefore, some light may pass through the lens without being focused. For example, incident light 46-1 may pass through the center of portion 76 of diffractive lens 72. The corresponding light 46-2 on the other side of the diffractive lens may travel in the same direction as incident light 46-1. In contrast, incident light at the edge of diffractive lens 72 may be redirected due to diffraction. For example, incident light 46-3 may pass by the edge of diffractive lens 72. The light may be redirected such that the output light 46-4 travels at an angle relative to the incident light 46-3. In other words, the diffractive lens redirects the light at the edge of the lens using diffraction. Additionally, due to the additional refractive index difference between portions 74 and 76 of the diffractive lens, light between the edge and center of the diffractive lens may also be redirected. For example, incident light 46-5 may pass by the interface of portions 74 and 76 of diffractive lens 72. The light may be redirected such that the output light 46-6 travels at an angle relative to the incident light 46-5.
The difference in refractive index between each material may be any desired refractive index difference (e.g., greater than 0.2, greater than 0.3, greater than 0.4, greater than 0.5, greater than 0.8, greater than 1.0, between 0.2 and 0.5, between 0.2 and 0.8, between 0.2 and 1.0, less than 1.0, less than 0.5, less than 0.3, etc.). The example of the diffractive lens having two portions in
The asymmetric diffractive lens may instead be a defocusing diffractive lens. As shown in
The difference in refractive index between each material may be any desired refractive index difference (e.g., greater than 0.2, greater than 0.3, greater than 0.4, greater than 0.5, greater than 0.8, greater than 1.0, between 0.2 and 0.5, between 0.2 and 0.8, between 0.2 and 1.0, less than 1.0, less than 0.5, less than 0.3, etc.).
The example of the diffractive lens having two portions in
The diffractive lenses of
As discussed in connection with
Color filters such as color filter elements 86 may be interposed between diffractive lenses 62 and substrate 80. Color filter elements 86 may filter incident light by only allowing predetermined wavelengths to pass through color filter elements 86 (e.g., color filter 86 may only be transparent to certain ranges of wavelengths). Color filter elements 86 may be part of a color filter array formed on the back surface of substrate 80. A respective diffractive lens 62 may cover each color filter element 86 in the color filter array. This example is merely illustrative. If desired, the diffractive lenses may be formed under color filter elements 86 such that the diffractive lenses are interposed between the color filter elements 86 and photosensitive regions 82. Light can enter from the back side of the image pixels through diffractive lenses 62. While in
Color filters 86 may include green filters, red filters, blue filters, yellow filters, cyan filters, magenta filters, clear filters, infrared filters, or other types of filters. As an example, a green filter passes green light (e.g., light with wavelengths from 495 nm to 570 nm) and reflects and/or absorbs light out of that range (e.g., the green filter reflects red light and blue light). An example of a color filter array pattern that may be used is the GRBG (green-red-blue-green) Bayer pattern. In this type of configuration, the color filter array is arranged into groups of four color filters. In each group, two of the four color filters are green filters, one of the four color filters is a red filter, and the remaining color filter is a blue filter. If desired, other color filter array patterns may be used.
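As an illustrative sketch of the GRBG Bayer arrangement described above (the function name is hypothetical), each 2x2 group tiles across the array with two green filters, one red, and one blue:

```python
def bayer_grbg(rows, cols):
    """Generate a GRBG Bayer color-filter pattern: each 2x2 group is
    green-red on the top row and blue-green on the bottom row."""
    tile = [["G", "R"], ["B", "G"]]  # the repeating 2x2 unit
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

for row in bayer_grbg(4, 4):
    print(" ".join(row))
```

For a 4x4 array this yields alternating G R / B G rows, so half of all filter elements are green, matching the pattern described in the text.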
A layer 94 (sometimes referred to as a planarization layer, passivation layer, dielectric layer, film, planar film, or planarization film) may be formed over diffractive lenses 62. Planarization layer 94 may be formed across the entire array of imaging pixels in image sensor 16. Layer 94 may have any desired index of refraction (e.g., greater than, less than, or equal to the index of refraction of portions 64 of diffractive lenses 62). A second layer 92 may be formed between diffractive lenses 62 (e.g., layer 92 may be interposed between the side surfaces of adjacent diffractive lenses 62). Layer 92 may have an index of refraction that is less than the index of refraction of portions 64 of diffractive lenses 62 when diffractive lenses are used to focus light. Alternatively, however, layer 92 may have an index of refraction that is greater than the index of refraction of portions 64 of the diffractive lenses if the diffractive lenses were used to defocus light. A third layer 90 may be formed under diffractive lenses 62 between diffractive lenses 62 and color filters 86. Layer 90 may have any desired index of refraction (e.g., greater than, less than, or equal to the index of refraction of portions 64 of diffractive lenses 62). Layers 90, 92, and 94 may be transparent and may be formed from any desired materials. Layers 90, 92, and 94 may be formed from the same materials or different materials. In one possible example, layers 90, 92, and 94 may all be formed from the same material and the diffractive lenses may be embedded within the material. Layers 90, 92, and 94 may sometimes be referred to as planarization layers, dielectric layers, or cladding layers. In some cases, one or more of layers 90, 92, and 94 may be formed from air (i.e., an air gap may be present between diffractive lenses 62).
The difference in refractive index between each diffractive lens portion may be any desired refractive index difference (e.g., greater than 0.2, greater than 0.3, greater than 0.4, greater than 0.5, greater than 0.8, greater than 1.0, between 0.2 and 0.5, between 0.2 and 0.8, between 0.2 and 1.0, less than 1.0, less than 0.5, less than 0.3, etc.).
Each portion of diffractive lenses 62 may be formed from any desired material. It may be desirable for diffractive lenses 62 to be transparent and formed from a material with a higher refractive index than the surrounding materials (e.g., layer 92). Each portion of each diffractive lens may be formed from silicon nitride (with a refractive index of approximately 2.0), from silicon dioxide (with a refractive index of approximately 1.45), from silicon oxynitride (with a refractive index of approximately 1.8), or any other desired material. In general, each portion of each diffractive lens 62 may have any desired index of refraction (e.g., between 1.8 and 2.0, between 1.6 and 2.2, between 1.5 and 2.5, between 1.5 and 2.0, more than 1.3, more than 1.6, more than 1.8, more than 2.0, less than 2.0, less than 1.8, etc.). Layer 92 may also be transparent and formed from a material with any desired refractive index (e.g., a lower refractive index than portions 64 of diffractive lenses 62). Planar layer 92 may be formed from a material with a refractive index between 1.3 and 1.5, between 1.2 and 1.8, greater than 1.3, or any other desired refractive index.
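As a hedged illustration of this material-selection principle, the approximate refractive indices given above can be used to enumerate which (lens, cladding) material pairings provide a given minimum index difference for focusing. The index values are the approximate figures from the text, and the function name is chosen for illustration only.

```python
# Approximate refractive indices from the description (illustrative values only)
MATERIALS = {
    "silicon nitride": 2.0,
    "silicon oxynitride": 1.8,
    "silicon dioxide": 1.45,
}

def focusing_pairs(min_delta=0.2):
    """List (lens, cladding, delta) triples where the lens index exceeds the
    cladding index by at least min_delta, as needed for a focusing lens."""
    pairs = []
    for lens, n_lens in MATERIALS.items():
        for clad, n_clad in MATERIALS.items():
            delta = round(n_lens - n_clad, 2)  # round to avoid float artifacts
            if delta >= min_delta:
                pairs.append((lens, clad, delta))
    return pairs

for p in focusing_pairs():
    print(p)
```

For example, a silicon nitride lens in silicon dioxide cladding gives an index difference of roughly 0.55, comfortably within the ranges recited above.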
The refractive indices of the portions of diffractive lenses 62 and the surrounding material (e.g., layer 92) may be selected such that light is focused by the diffractive lenses towards the photodiodes of the pixels.
As previously discussed, the refractive indices of the diffractive lenses and surrounding materials, as well as the dimensions of the diffractive lenses, may be altered to customize the response to incident light. Additionally, the distance 102 between adjacent diffractive lenses may be altered to change the response to incident light.
In some embodiments, the diffractive lens over each pixel in the pixel array may be the same. However, in other embodiments different pixels may have unique diffractive lenses to further customize the response to incident light.
In the example of
In the example of
Each diffractive lens 62 may have any desired shape.
In the example of
In
The stacked diffractive lenses of
In
In
In various embodiments, an image sensor may include a plurality of imaging pixels and each imaging pixel may include a photodiode and a diffractive lens formed over the photodiode. The diffractive lens may have a first portion with a first refractive index and a second portion with a second refractive index that is different than the first refractive index.
The first portion may be an edge portion and the second portion may be a center portion and the first refractive index may be less than the second refractive index. The edge portion may laterally surround the center portion. The edge portion may be adjacent a material with a third refractive index that is different than the first and second refractive indices. The third refractive index may be less than the first refractive index.
Each diffractive lens may also include a third portion with a third refractive index that is different than the first and second refractive indices. The second portion may be interposed between the first portion and the third portion and the second refractive index may be less than the first refractive index and greater than the third refractive index. The diffractive lens of each imaging pixel may have a planar upper surface and a planar lower surface. No microlens with a curved surface may be formed over the diffractive lens of each pixel. Each imaging pixel of the plurality of imaging pixels may also include a color filter element interposed between the photodiode and the diffractive lens of that imaging pixel.
In various embodiments, an image sensor may include a plurality of imaging pixels and each imaging pixel of the plurality of imaging pixels may include a photosensitive area, a color filter element formed over the photosensitive area, and a multipart diffractive lens formed over the color filter element that focuses incident light on the photosensitive area.
The multipart diffractive lens may include first and second portions having respective first and second refractive indices. The second portion may laterally surround the first portion. The second refractive index may be less than the first refractive index. The multipart diffractive lens may be surrounded by a material having a third refractive index that is less than the second refractive index. The multipart diffractive lens may include a first layer and a second layer formed over the first layer, the first layer may have a first width and the second layer may have a second width that is greater than the first width. The first and second layers may have the same refractive index and may be surrounded by a material with an additional refractive index that is less than the refractive index of the first and second layers. No microlens with a curved surface may be formed over the multipart diffractive lens of each pixel.
In various embodiments, an image sensor may include a plurality of imaging pixels and each imaging pixel of the plurality of imaging pixels may include a photodiode and a diffractive lens formed over the photodiode. The diffractive lens may have a plurality of portions with respective refractive indices that increase as a distance of the respective portion from an edge of the diffractive lens increases. The plurality of portions may be concentric.
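As an illustrative sketch of such a graded-index arrangement (the function name and the linear index profile are assumptions for illustration; any monotonic profile would satisfy the description), the refractive index of each concentric portion can be assigned so that it increases from the edge of the lens toward its center:

```python
def concentric_indices(n_edge, n_center, portions):
    """Refractive index of each concentric portion, listed from the edge of
    the lens to its center, increasing linearly so that the lens focuses
    light toward its center."""
    step = (n_center - n_edge) / (portions - 1)
    return [round(n_edge + i * step, 3) for i in range(portions)]

# A four-portion lens spanning silicon dioxide-like to silicon nitride-like indices
print(concentric_indices(1.45, 2.0, 4))  # [1.45, 1.633, 1.817, 2.0]
```

Reversing the two endpoint arguments would produce the complementary defocusing profile, with indices decreasing toward the center.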
The foregoing is merely illustrative of the principles of this invention and various modifications can be made by those skilled in the art without departing from the scope and spirit of the invention.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Sep 07 2018 | Semiconductor Components Industries, LLC | (assignment on the face of the patent) | / | |||
Sep 07 2018 | LEE, BYOUNGHEE | Semiconductor Components Industries, LLC | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 046819 | /0898 | |
Oct 18 2018 | Semiconductor Components Industries, LLC | DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT | SECURITY INTEREST SEE DOCUMENT FOR DETAILS | 047399 | /0631 | |
Oct 18 2018 | Fairchild Semiconductor Corporation | DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT | SECURITY INTEREST SEE DOCUMENT FOR DETAILS | 047399 | /0631 | |
Jun 22 2023 | DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT | Semiconductor Components Industries, LLC | RELEASE OF SECURITY INTEREST IN PATENTS RECORDED AT REEL 047399, FRAME 0631 | 064078 | /0001 | |
Jun 22 2023 | DEUTSCHE BANK AG NEW YORK BRANCH, AS COLLATERAL AGENT | Fairchild Semiconductor Corporation | RELEASE OF SECURITY INTEREST IN PATENTS RECORDED AT REEL 047399, FRAME 0631 | 064078 | /0001 |
Date | Maintenance Fee Events |
Sep 07 2018 | BIG: Entity status set to Undiscounted (note the period is included in the code). |
Apr 20 2023 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Date | Maintenance Schedule |
Nov 19 2022 | 4 years fee payment window open |
May 19 2023 | 6 months grace period start (w surcharge) |
Nov 19 2023 | patent expiry (for year 4) |
Nov 19 2025 | 2 years to revive unintentionally abandoned end. (for year 4) |
Nov 19 2026 | 8 years fee payment window open |
May 19 2027 | 6 months grace period start (w surcharge) |
Nov 19 2027 | patent expiry (for year 8) |
Nov 19 2029 | 2 years to revive unintentionally abandoned end. (for year 8) |
Nov 19 2030 | 12 years fee payment window open |
May 19 2031 | 6 months grace period start (w surcharge) |
Nov 19 2031 | patent expiry (for year 12) |
Nov 19 2033 | 2 years to revive unintentionally abandoned end. (for year 12) |