A pixel group of an image sensor includes first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns, and a common floating diffusion region in a semiconductor substrate at a center of the pixel group and shared by the first through fourth unit pixels. Each of the first through fourth unit pixels includes a photoelectric conversion element in the semiconductor substrate, and a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate. The pair of vertical transfer gates transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region. Image quality is enhanced by increasing the sensing sensitivity of the unit pixels through the shared structure of the floating diffusion region and the symmetric structure of the vertical transfer gates.
1. A pixel group of an image sensor, the pixel group comprising:
first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns; and
a common floating diffusion region in a semiconductor substrate at a center of the pixel group and shared by the first through fourth unit pixels,
each of the first through fourth unit pixels comprising:
a photoelectric conversion element in the semiconductor substrate; and
a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate, the pair of vertical transfer gates configured to transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region, and the pair of vertical transfer gates being configured to be activated simultaneously to perform low-luminance sensing.
15. A pixel array of an image sensor, the pixel array comprising:
a plurality of pixel groups, each pixel group comprising:
first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns; and
a common floating diffusion region in a semiconductor substrate at a center of each pixel group and shared by the first through fourth unit pixels,
each of the first through fourth unit pixels comprising:
a photoelectric conversion element in the semiconductor substrate; and
a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate, the pair of vertical transfer gates configured to transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region, and the pair of vertical transfer gates being configured to be activated simultaneously to perform low-luminance sensing.
20. An image sensor comprising:
a pixel array including a plurality of pixel groups configured to collect photo charges generated by an incident light;
a row driver configured to drive the pixel array row by row; and
a controller configured to control the pixel array and the row driver, each pixel group comprising:
first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns; and
a common floating diffusion region in a semiconductor substrate at a center of each pixel group and shared by the first through fourth unit pixels,
each of the first through fourth unit pixels comprising:
a photoelectric conversion element in the semiconductor substrate; and
a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate, the pair of vertical transfer gates configured to transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region, and the pair of vertical transfer gates being configured to be activated simultaneously to perform low-luminance sensing.
2. The pixel group of
3. The pixel group of
4. The pixel group of
5. The pixel group of
6. The pixel group of
trench structures in the semiconductor substrate and extending in the vertical direction from an upper surface of the semiconductor substrate to a lower surface of the semiconductor substrate to electrically and optically separate the first through fourth unit pixels.
7. The pixel group of
inter-group trench structures separating the pixel group from other pixel groups; and
inter-pixel trench structures separating the first through fourth unit pixels included in the pixel group from each other.
8. The pixel group of
a first inter-pixel trench structure extending in a first horizontal direction to be connected to the inter-group trench structures at both sides of the pixel group in the first horizontal direction and extending in the vertical direction from the upper surface of the semiconductor substrate to the lower surface of the semiconductor substrate; and
a second inter-pixel trench structure extending in a second horizontal direction perpendicular to the first horizontal direction to be connected to the inter-group trench structures at both sides of the pixel group in the second horizontal direction and extending in the vertical direction from the upper surface of the semiconductor substrate to the lower surface of the semiconductor substrate.
9. The pixel group of
10. The pixel group of
11. The pixel group of
12. The pixel group of
a common microlens above or below the semiconductor substrate, the common microlens covering all of the first through fourth photoelectric conversion elements respectively included in the first through fourth unit pixels, the common microlens configured to focus an incident light to the first through fourth photoelectric conversion elements.
13. The pixel group of
two common microlenses of an ellipse shape above or below the semiconductor substrate, each of the two common microlenses of the ellipse shape covering two of the first through fourth photoelectric conversion elements respectively included in the first through fourth unit pixels, the two common microlenses configured to focus an incident light to the two of the first through fourth photoelectric conversion elements.
14. The pixel group of
a single color filter shared by the first through fourth unit pixels.
16. The pixel array of
17. The pixel array of
18. The pixel array of
first through fourth pixel groups in a matrix form of two group rows and two group columns.
19. The pixel array of
first through fourth pixel groups in a matrix form of a first group row, a second group row, a first group column and a second group column;
fifth through eighth pixel groups in a matrix form of the first group row, the second group row, a third group column and a fourth group column;
ninth through twelfth pixel groups in a matrix form of a third group row, a fourth group row, the first group column and the second group column; and
thirteenth through sixteenth pixel groups in a matrix form of the third group row, the fourth group row, the third group column and the fourth group column.
This U.S. non-provisional application claims priority under 35 USC § 119 to Korean Patent Application No. 10-2020-0177204, filed on Dec. 17, 2020, in the Korean Intellectual Property Office (KIPO), the disclosure of which is incorporated by reference herein in its entirety.
Example embodiments relate generally to semiconductor integrated circuits, and more particularly to a pixel group and a pixel array including the pixel group of an image sensor.
Complementary metal oxide semiconductor (CMOS) image sensors are solid-state sensing devices that use complementary metal oxide semiconductors. CMOS image sensors have lower manufacturing costs and/or lower power consumption compared with charge-coupled device (CCD) image sensors. Thus, CMOS image sensors are used for various electronic appliances, including portable devices such as smartphones and digital cameras.
A pixel array included in a CMOS image sensor may include a photoelectric conversion element, such as a photodiode, in each pixel. The photoelectric conversion element generates an electrical signal that varies based on the quantity of incident light. The CMOS image sensor processes electrical signals to synthesize an image. With the recent proliferation of high-resolution images, pixels included in the CMOS image sensor are becoming much smaller. When the pixels get smaller, incident light may not be properly sensed, or noise may occur due to interference between highly integrated elements. Also, the CMOS image sensor is required to provide enhanced image quality and to perform additional functions other than capturing an image.
Some example embodiments may provide a pixel group of an image sensor having enhanced sensing sensitivity and a pixel array including the pixel group.
According to example embodiments, a pixel group of an image sensor includes first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns, and a common floating diffusion region in a semiconductor substrate at a center of the pixel group and shared by the first through fourth unit pixels. Each of the first through fourth unit pixels includes a photoelectric conversion element in the semiconductor substrate, and a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate. The pair of vertical transfer gates transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region.
According to example embodiments, a pixel array of an image sensor includes a plurality of pixel groups. Each pixel group includes first through fourth unit pixels in a matrix form of two pixel rows and two pixel columns, and a common floating diffusion region in a semiconductor substrate at a center of each pixel group and shared by the first through fourth unit pixels. Each of the first through fourth unit pixels includes a photoelectric conversion element in the semiconductor substrate, and a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate.
According to example embodiments, an image sensor includes a pixel array including a plurality of pixel groups configured to collect photo charges generated by an incident light, a row driver configured to drive the pixel array row by row, and a controller configured to control the pixel array and the row driver. Each pixel group includes first through fourth unit pixels arranged in a matrix form of two pixel rows and two pixel columns and a common floating diffusion region in a semiconductor substrate at a center of each pixel group and shared by the first through fourth unit pixels. Each of the first through fourth unit pixels includes a photoelectric conversion element in the semiconductor substrate, and a pair of vertical transfer gates in the semiconductor substrate and extending in a vertical direction perpendicular to a surface of the semiconductor substrate.
The pixel array and the image sensor including the pixel group according to example embodiments may enhance image quality by increasing the sensing sensitivity of the unit pixel through the shared structure of the floating diffusion region and/or the symmetric structure of the vertical transfer gates.
In addition, the pixel group according to example embodiments may reduce cross-talk between the unit pixels and/or further enhance the image quality through the trench structure extending in the vertical direction from the upper surface of the semiconductor substrate to the lower surface of the semiconductor substrate.
In addition, the pixel array and the image sensor including the pixel group according to example embodiments may more efficiently implement a high dynamic range (HDR) through independent driving of the two vertical transfer gates in each unit pixel.
Example embodiments of the present disclosure will be more clearly understood from the following detailed description taken in conjunction with the accompanying drawings.
Various example embodiments will be described more fully hereinafter with reference to the accompanying drawings, in which some example embodiments are shown. In the drawings, like numerals refer to like elements throughout. The repeated descriptions may be omitted.
Hereinafter, structures according to example embodiments are described using a first horizontal direction DR1, a second horizontal direction DR2, and a vertical direction DR3 in a three-dimensional space. The first horizontal direction DR1 and the second horizontal direction DR2 may be substantially parallel with an upper surface of a semiconductor substrate and substantially perpendicular to each other. The vertical direction DR3 may be substantially perpendicular to the upper surface of the semiconductor substrate. The first horizontal direction DR1 may be a row direction and the second horizontal direction DR2 may be a column direction.
Referring to
The first through fourth unit pixels PX11˜PX22 may be arranged in a matrix form of two pixels rows PR1 and PR2 and two pixel columns PC1 and PC2. The common floating diffusion region CFD may be disposed in a semiconductor substrate 100 at a center of the pixel group PXG and may be shared by the first through fourth unit pixels PX11˜PX22.
The trench structures 400 and 500 may be disposed in the semiconductor substrate 100 and extend in the vertical direction DR3 from the upper surface 100a of the semiconductor substrate 100 to the lower surface 100b of the semiconductor substrate 100 to electrically and optically separate the unit pixels PX11˜PX22 from each other. The trench structures 400 and 500 may include inter-group trench structures 400 separating the pixel group PXG from other pixel groups and inter-pixel trench structures 500 separating the unit pixels PX11˜PX22 included in the pixel group PXG from each other.
The inter-pixel trench structures 500 may include a first inter-pixel trench structure 500x and a second inter-pixel trench structure 500y. The first inter-pixel trench structure 500x may extend in the first horizontal direction DR1 to be connected to the inter-group trench structures 400 disposed at both sides of the pixel group PXG in the first horizontal direction DR1 and extend in the vertical direction DR3 from the upper surface 100a of the semiconductor substrate 100 to the lower surface 100b of the semiconductor substrate 100. The second inter-pixel trench structure 500y may extend in the second horizontal direction DR2 to be connected to the inter-group trench structures 400 disposed at both sides of the pixel group PXG in the second horizontal direction DR2 and extend in the vertical direction DR3 from the upper surface 100a of the semiconductor substrate 100 to the lower surface 100b of the semiconductor substrate 100.
The inter-pixel trench structures 500 may reduce or prevent light incident on adjacent unit pixels from propagating into each unit pixel. In addition, the inter-pixel trench structures 500 may reduce or prevent photo charges generated in adjacent unit pixels from being transferred to each unit pixel. In other words, the inter-pixel trench structures 500 may reduce or prevent cross-talk between the photoelectric conversion elements PD11˜PD22. In addition, the inter-group trench structures 400 may reduce or prevent cross-talk between adjacent pixel groups.
The pixel group PXG has a structure such that at least a portion of the first inter-pixel trench structure 501x and at least a portion of the second inter-pixel trench structure 501y may be removed in a cross region CREG of the first inter-pixel trench structure 501x and the second inter-pixel trench structure 501y. Electrons may overflow through the removed portion between the unit pixels PX11˜PX22, and the amount of the overflowing electrons may be controlled by a potential profile in the semiconductor substrate 100.
Each of the first through fourth unit pixels PX11˜PX22 may include a photoelectric conversion element disposed in the semiconductor substrate, and a pair of vertical transfer gates disposed in the semiconductor substrate 100 and extending in the vertical direction DR3. The pair of vertical transfer gates may transfer photo charges collected by the photoelectric conversion element to the common floating diffusion region CFD. The first unit pixel PX11 may include a first photodiode PD11, a first vertical transfer gate VTG1 and a second vertical transfer gate VTG2, the second unit pixel PX12 may include a second photodiode PD12 (not shown), a third vertical transfer gate VTG3 and a fourth vertical transfer gate VTG4, the third unit pixel PX21 may include a third photodiode PD21, a fifth vertical transfer gate VTG5 and a sixth vertical transfer gate VTG6, and the fourth unit pixel PX22 may include a fourth photodiode PD22 (not shown), a seventh vertical transfer gate VTG7 and an eighth vertical transfer gate VTG8.
The pixel group PXG may be symmetrical with respect to a first horizontal line HLX passing through a center CP of the pixel group PXG and extending in the first horizontal direction DR1. In addition, the pixel group PXG may be symmetrical with respect to a second horizontal line HLY passing through the center CP of the pixel group PXG and extending in the second horizontal direction DR2. According to example embodiments, the pixel group PXG may be symmetrical with respect to a vertical line VLZ passing through the center CP of the pixel group PXG and extending in the vertical direction DR3.
Hereinafter, only the structure of the first unit pixel PX11 is further described with reference to
The pair of vertical transfer gates included in each unit pixel may be electrically insulated from each other, and the pair of vertical transfer gates included in each unit pixel may be controlled independently of each other by a pair of transfer control signals. For example, as illustrated in
The pair of vertical transfer gates VTG1 and VTG2 included in the first unit pixel PX11 may be symmetric with respect to a vertical plane passing through a diagonal line HDL in
As illustrated in
In some example embodiments, as a pixel group PXG1 illustrated in
In some example embodiments, as a pixel group PXG3 illustrated in
In some example embodiments, as a pixel group PXG4 illustrated in
Photons incident on the semiconductor substrate 100 may enter the P− region 104, and may generate electron-hole pairs in the P− region 104. That is, the P− region 104 may correspond to a main photo-charge generating region where photo-charges may be mainly generated. Photo-electrons generated as minority carriers may move into a depletion region of an N-P junction at a boundary between the N region 103 and the P− region 104. Since the P+ region 105, which is heavily doped, is located below the P− region 104, the photo charges may tend to reside near the N-P junction. In some example embodiments, the N region 103 may be replaced with a P region. In some example embodiments, the photodiode as illustrated in
Referring to
The pixel array 620 includes a plurality of pixels 700 coupled to column lines COL, respectively, and the plurality of pixels 700 sense incident light to generate analog signals through the column lines COL. The plurality of pixels 700 may be arranged in a matrix form with a plurality of rows and a plurality of columns. The pixel array 620 may have a structure in which various unit patterns, which will be described below with reference to
The row driver 630 may be coupled to the rows of the pixel array 620 to generate signals for driving the rows. For example, the row driver 630 may drive the pixels in the pixel array 620 row by row.
The analog-to-digital conversion circuit 640 may be coupled to the columns of the pixel array 620 to convert the analog signals from the pixel array 620 to digital signals. As illustrated in
The analog-to-digital conversion circuit 640 may include a correlated double sampling (CDS) unit. In some example embodiments, the CDS unit may perform an analog double sampling by extracting a valid image component based on a difference between an analog reset signal and an analog image signal. In some example embodiments, the CDS unit may perform a digital double sampling by converting the analog reset signal and the analog image signal to two digital signals and extracting a difference between the two digital signals as the valid image component. In some example embodiments, the CDS unit may perform a dual CDS by performing both the analog double sampling and digital double sampling.
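The digital double sampling described above amounts to a subtraction of the two digitized samples. The following sketch is illustrative only and not part of the disclosure; the function name is hypothetical.

```python
# Illustrative sketch of digital double sampling: the valid image component
# is the difference between the digitized image sample and the digitized
# reset sample, which cancels the pixel's reset offset (e.g., kTC noise).

def digital_double_sampling(reset_code: int, image_code: int) -> int:
    """Return the valid image component in ADC counts."""
    return image_code - reset_code

# Example: the reset level digitizes to 2 counts and the reset-plus-signal
# level digitizes to 25 counts, so the valid image component is 23 counts.
valid = digital_double_sampling(2, 25)
```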
The column driver 650 may output the digital signals from the analog-to-digital conversion circuit 640 sequentially as output data Dout.
The controller 660 may control the row driver 630, the analog-to-digital conversion circuit 640, the column driver 650, and/or the reference signal generator 670. The controller 660 may provide control signals such as clock signals, timing control signals, etc. required for the operations of the row driver 630, the analog-to-digital conversion circuit 640, the column driver 650, and/or the reference signal generator 670. The controller 660 may include a control logic circuit, a phase-locked loop, a timing control circuit, a communication interface circuit, etc.
The reference signal generator 670 may generate a reference signal or a ramp signal that increases or decreases gradually and provide the ramp signal to the analog-to-digital conversion circuit 640.
Referring to
For example, the photodiode PD may include an n-type region in a p-type substrate such that the n-type region and the p-type substrate form a p-n junction diode. The photodiode PD receives the incident light and generates a photo-charge based on the incident light. In some example embodiments, the unit pixel 600a may include a phototransistor, a photogate, and/or a pinned photodiode, etc. instead of, or in addition to, the photodiode PD.
The photo-charge generated in the photodiode PD may be transferred to a floating diffusion node FD through the transfer transistor TX. The transfer transistor TX may be turned on in response to a transfer control signal TG.
The drive transistor DX functions as a source follower amplifier that amplifies a signal corresponding to the charge on the floating diffusion node FD. The selection transistor SX may transfer the pixel signal Vpix to a column line COL in response to a selection signal SEL.
The floating diffusion node FD may be reset by the reset transistor RX. For example, the reset transistor RX may discharge the floating diffusion node FD in response to a reset signal RS for correlated double sampling (CDS).
Referring to
At a time t2, the row driver 630 may provide an activated reset control signal RS to the selected row, and the controller 660 may provide an up-down control signal UD having a logic high level to a counter included in the ADC 641. From the time t2, the pixel array 620 may output a first analog signal corresponding to a reset component Vrst as the pixel voltage Vpix.
At a time t3, the controller 660 may provide a count enable signal CNT_EN having a logic high level to the reference signal generator 670, and the reference signal generator 670 may start to decrease the reference signal Vref at the constant rate, e.g., a slope of ‘a’. The controller 660 may provide a count clock signal CLKC to the counter, and the counters may perform down-counting from zero in synchronization with the count clock signal CLKC.
At a time t4, a magnitude of the reference signal Vref may become smaller than a magnitude of the pixel voltage Vpix, and a comparator included in the ADC 641 may provide a comparison signal CMP having a logic low level to the counter so that the counter stops performing the down-counting. At the time t4, a counter output of the counter may be the first counting value that corresponds to the reset component Vrst. In the example of
At a time t5, the controller 660 may provide the count enable signal CNT_EN having a logic low level to the reference signal generator 670, and the reference signal generator 670 may stop generating the reference signal Vref.
A period from the time t3 to the time t5 corresponds to a maximum time for detecting the reset component Vrst. A length of the period from the time t3 to the time t5 may be determined as a certain number of cycles of the count clock signal CLKC according to a characteristic of the image sensor 700.
At a time t6, the row driver 630 may provide an activated transfer control signal TG (e.g., the transfer control signal TG having a logic high level) to the selected row, and the controller 660 may provide the up-down control signal UD having a logic low level to the counter. From the time t6, the pixel array 620 may output a second analog signal AS2 corresponding to a detected incident light Vrst+Vsig as the pixel voltage Vpix.
At a time t7, the controller 660 may provide the count enable signal CNT_EN having a logic high level to the reference signal generator 670, and the reference signal generator 670 may start to decrease the reference signal Vref at the same constant rate as at the time t3, e.g., a slope of ‘a’. The comparator may provide the comparison signal CMP having a logic high level to the counter since the pixel voltage Vpix is smaller than the reference signal Vref. The controller 660 may provide the count clock signal CLKC to the counter, and the counter may perform an up-counting from the first counting value, which corresponds to the reset component Vrst, in synchronization with the count clock signal CLKC.
At a time t8, the magnitude of the reference signal Vref may become smaller than the magnitude of the pixel voltage Vpix, and the comparator may provide the comparison signal CMP having a logic low level to the counter so that the counter stops performing the up-counting. At the time t8, the counter output of the counter may correspond to a difference between the first analog signal representing the reset component Vrst (e.g., −2 in the example of
At a time t9, the controller 660 may provide the count enable signal CNT_EN having a logic low level to the reference signal generator 670, and the reference signal generator 670 may stop generating the reference signal Vref.
A period from the time t7 to the time t9 corresponds to a maximum time for detecting the detected incident light Vrst+Vsig. A length of the period from the time t7 to the time t9 may be determined as a certain number of cycles of the count clock signal CLKC according to a characteristic of the image sensor 700.
At a time t10, the row driver 630 may provide a deactivated row selection signal SEL (e.g., the row selection signal having a low level) to the selected row of the pixel array 620, and the counter may reset the counter output to zero.
After that, the image sensor 700 may repeat the above-described operations for each row to generate the digital signals row by row.
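The t1 through t10 sequence above can be modeled as a down-count during the reset conversion followed by an up-count during the signal conversion, so the final counter value depends only on the signal component. The sketch below is illustrative rather than the disclosed circuit; the ramp, comparator, and counter are idealized, and all names and voltage values are hypothetical.

```python
# Idealized model of a single-slope ADC with up/down counting.
# count_crossing counts clock cycles until the falling ramp (the reference
# signal Vref) drops below the pixel voltage Vpix; voltages are in mV.

def count_crossing(vpix_mv: int, v_start_mv: int = 1000, step_mv: int = 10) -> int:
    n = 0
    vref = v_start_mv
    while vref >= vpix_mv:
        vref -= step_mv
        n += 1
    return n

def correlated_double_sample(v_reset_mv: int, v_signal_mv: int) -> int:
    # Times t3-t4: down-count while converting the reset component.
    counter = -count_crossing(v_reset_mv)
    # Times t7-t8: up-count from that value while converting reset + signal.
    counter += count_crossing(v_signal_mv)
    # The reset contribution cancels; only the signal component remains.
    return counter

# A 200 mV signal swing with a 10 mV/cycle ramp yields 20 counts.
code = correlated_double_sample(900, 700)  # -> 20
```

The down-count followed by the up-count realizes the subtraction of the reset component in hardware, which is why no separate digital subtraction stage is needed.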
The inventive concepts are not limited to the example configuration and operation described with reference to
Referring to
Control signals TG1˜TG8, RS and DCG may be provided from the row driver (e.g., the row driver 630 in
The first unit pixel PX11 may include a first photodiode PD11, a first vertical transfer gate VTG1 and a second vertical transfer gate VTG2, the second unit pixel PX12 may include a second photodiode PD12, a third vertical transfer gate VTG3 and a fourth vertical transfer gate VTG4, the third unit pixel PX21 may include a third photodiode PD21, a fifth vertical transfer gate VTG5 and a sixth vertical transfer gate VTG6, and the fourth unit pixel PX22 may include a fourth photodiode PD22, a seventh vertical transfer gate VTG7 and an eighth vertical transfer gate VTG8.
The readout circuit 800 may include a reset transistor RX, a gain adjusting transistor GX, a capacitor Cdcg, a source follower transistor or a driving transistor DX, and/or a selection transistor SX.
The reset transistor RX may be connected between a reset voltage VRST and a gain adjusting node Ndcg and the reset transistor RX may be turned on and off in response to a reset signal RS. The gain adjusting transistor GX may be connected between the gain adjusting node Ndcg and the common floating diffusion node FD and the gain adjusting transistor GX may be turned on and off in response to a gain adjusting signal DCG. The capacitor Cdcg may be connected in parallel with the reset transistor RX between the reset voltage VRST and the gain adjusting node Ndcg. As will be described with reference to
Referring to
In some example embodiments, as illustrated in
In some example embodiments, as illustrated in
As such, the pixel array and the image sensor including the pixel group according to example embodiments may more efficiently implement a high dynamic range (HDR) through independent driving of the two vertical transfer gates in each unit pixel.
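The conversion-gain mechanism behind such HDR operation can be sketched numerically: turning the gain adjusting transistor GX on adds the capacitance of Cdcg to the floating diffusion node, which lowers the conversion gain (microvolts per electron) and raises the charge capacity before saturation. This is an illustrative calculation under assumed capacitance values, not figures from the disclosure.

```python
# Conversion gain of a floating diffusion node: CG = q / C_total.
# Assumed values: C_FD = 1 fF, Cdcg = 4 fF (hypothetical, for illustration).

Q_E = 1.602e-19  # elementary charge in coulombs

def conversion_gain_uV_per_e(c_fd_fF: float, c_dcg_fF: float, gx_on: bool) -> float:
    """Microvolts of output swing per collected electron."""
    c_total_F = (c_fd_fF + (c_dcg_fF if gx_on else 0.0)) * 1e-15
    return Q_E / c_total_F * 1e6

hcg = conversion_gain_uV_per_e(1.0, 4.0, gx_on=False)  # high gain for low light
lcg = conversion_gain_uV_per_e(1.0, 4.0, gx_on=True)   # low gain for bright light
# The two readouts differ by the capacitance ratio (here 5x); combining them
# extends the dynamic range by roughly that factor.
```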
Referring to
The pixel signal Vpix output from the pixel array may include shot noise, which increases with ambient light, and circuit noise caused by characteristics of internal circuits of the pixel array. Even though the gain of the pixel is increased using the gain adjusting transistor GX and the capacitor Cdcg as illustrated in
According to example embodiments, the shot noise and/or the circuit noise of the target color pixels (e.g., the blue color pixels) may be reduced and the sensing sensitivity of the target color pixels may be enhanced.
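The limit noted above, that gain alone cannot remove shot noise, follows from the noise model: shot noise scales as the square root of the collected electrons, so amplification applied after charge collection scales signal and noise alike. Below is a hedged numerical sketch with assumed electron counts and read noise, not values from the disclosure.

```python
import math

# SNR of a pixel signal with shot noise (sqrt of collected electrons) and
# circuit/read noise, both referred to the charge domain. Values are
# hypothetical; the point is that post-collection gain cancels out.

def snr_db(signal_e: float, read_noise_e: float, gain: float = 1.0) -> float:
    signal = gain * signal_e
    noise = gain * math.sqrt(signal_e + read_noise_e ** 2)
    return 20.0 * math.log10(signal / noise)

# Increasing gain leaves SNR unchanged; collecting more photo charges
# (higher sensing sensitivity) is what improves it.
base = snr_db(1000.0, 2.0)
amplified = snr_db(1000.0, 2.0, gain=8.0)
```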
Hereinafter, example embodiments are described based on structures corresponding to a back-side illumination (BSI) such that a light is incident through a lower surface of a semiconductor substrate, and it will be understood that example embodiments may be applied to structures corresponding to a front-side illumination (FSI) such that a light is incident through an upper surface of a semiconductor substrate.
Referring to
The optically-transparent layer 300 may include first through fourth color filters CF11˜CF22 and first through fourth microlenses MLS11˜MLS22 corresponding to the first through fourth unit pixels PX11˜PX22. The optically-transparent layer 300 may be configured to allow external incident light to be filtered and focused on the semiconductor substrate 100. The color filters CF11˜CF22 and the microlenses MLS11˜MLS22 may be provided on the lower surface 100b of the semiconductor substrate 100. A first flattening layer 310 may be disposed between the lower surface of the semiconductor substrate 100 and the color filters CF11˜CF22, and a second flattening layer 320 may be disposed between the color filters CF11˜CF22 and the microlenses MLS11˜MLS22.
The color filters CF11˜CF22 may include one of red, green, and blue filters. In some example embodiments, the color filters CF11˜CF22 may include one of cyan, magenta, and yellow filters. When the unit pixels PX11˜PX22 in the pixel group PXG correspond to the same color, the unit pixels PX11˜PX22 may share the single color filter CF as illustrated in
Referring to
The image sensor may perform auto focusing to adjust a focus of a device including the image sensor based on a difference between the electrical signals from the photoelectric conversion elements PD11˜PD22 that share the common microlens CMLS.
As such, the pixel array and the image sensor according to example embodiments may implement an auto focusing function and also enhance image quality by reducing cross-talk between the unit pixels using the plurality of unit pixels sharing the common microlens and trench structures extending from the upper surface of the semiconductor substrate to the lower surface of the semiconductor substrate.
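The auto-focusing principle can be sketched crudely: unit pixels under the left and right halves of a common microlens view the scene through different parts of the lens pupil, so a defocused image produces mismatched half-signals. The code below is illustrative only (a practical implementation correlates the two half-images to estimate their shift); all names and values are hypothetical.

```python
# Crude phase-detection metric: average difference between signals of
# pixels under the left and right halves of shared microlenses.
# Near zero when in focus; its sign suggests the direction to move the lens.

def focus_error(left_signals, right_signals):
    assert len(left_signals) == len(right_signals)
    n = len(left_signals)
    return sum(l - r for l, r in zip(left_signals, right_signals)) / n

in_focus = focus_error([10, 12, 11], [10, 12, 11])  # -> 0.0
defocused = focus_error([14, 15, 13], [9, 10, 8])   # -> 5.0
```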
The trench structures 400 and 500 may be formed of an insulating material having a refractive index lower than that of the semiconductor substrate 100 (e.g., of silicon), and may include one or more insulating layers. For example, the trench structures 400 and 500 may be formed of or include at least one of a silicon oxide layer, a silicon nitride layer, an undoped poly-silicon layer, air, or combinations thereof. The formation of the trench structures 400 and 500 may include removing portions of the upper surface 100a and/or the lower surface 100b of the semiconductor substrate 100 to form a deep trench and filling the deep trench with an insulating material.
As described above, the trench structures 400 and 500 may be disposed in the semiconductor substrate 100 and extend in the vertical direction DR3 from the upper surface 100a of the semiconductor substrate 100 and to the lower surface 100b of the semiconductor substrate 100 to electrically and optically separate the photoelectric conversion elements PD11˜PD22 from each other.
In some example embodiments, as a pixel group PXG7 illustrated in
In some example embodiments, as a pixel group PXG8 illustrated in
In some example embodiments, the upper trench structure 400t and 500t has a structure or a composition different from the lower trench structure 400b and 500b. For example, as a pixel group PXG3 illustrated in
Referring to
In some example embodiments, as illustrated in
In some example embodiments, as illustrated in
Referring to
In some example embodiments, all of the unit patterns UPTT in the pixel array 620 may be identical. In some example embodiments, the unit pattern UPTT is a minimum pattern that cannot be divided into smaller patterns. In some example embodiments, the unit patterns UPTT in the pixel array 620 may include two or more different patterns such that the different patterns are arranged regularly in the first horizontal direction DR1 and/or the second horizontal direction DR2.
Hereinafter, various color filter array and unit patterns according to example embodiments are described with reference to
In
Referring to
In some example embodiments, as the unit pattern UPTT1 illustrated in
In some example embodiments, as the unit pattern UPTT2 illustrated in
In some example embodiments, as the unit pattern UPTT3 illustrated in
In some example embodiments, as the unit pattern UPTT4 illustrated in
In some example embodiments, as the unit pattern UPTT51 illustrated in
In some example embodiments, as the unit pattern UPTT6 illustrated in
In some example embodiments, as the unit pattern UPTT7 illustrated in
Referring to
In some example embodiments, as the unit pattern UPTT8 illustrated in
In some example embodiments, as the unit pattern UPTT9 illustrated in
In some example embodiments, as the unit pattern UPTT10 illustrated in
Referring to
In some example embodiments, as illustrated in
In some example embodiments, as illustrated in
In comparison with the unit patterns UPTT1 and UPTT8 of
Referring to
In some example embodiments, as the unit pattern UPTT13 illustrated in
In some example embodiments, as the unit pattern UPTT14 illustrated in
In some example embodiments, as the unit pattern UPTT15 illustrated in
In comparison with the unit patterns UPTT1 and UPTT8 of
Referring to
The camera module group 1100 may include a plurality of camera modules 1100a, 1100b and 1100c.
Hereinafter, an example configuration of the camera module 1100b is described with reference to
Referring to
The prism 1105 may include a reflection surface 1107 to change a path of a light L incident on the prism 1105.
In some example embodiments, the prism 1105 may change the path of the light L incident in a first direction X to the path in a second direction Y perpendicular to the first direction X. In addition, the prism 1105 may rotate the reflection surface 1107 around a center axis 1106 and/or rotate the center axis 1106 in the B direction to align the path of the reflected light along the second direction Y. In addition, the OPFE 1110 may move in a third direction perpendicular to the first direction X and the second direction Y.
In some example embodiments, a rotation angle of the prism 1105 may be smaller than 15 degrees in the positive (+) A direction and greater than 15 degrees in the negative (−) A direction, but example embodiments are not limited thereto.
In some example embodiments, the prism 1105 may rotate within 20 degrees in the positive B direction and the negative B direction.
In some example embodiments, the prism 1105 may move the reflection surface 1106 in the third direction Z that is in parallel with the center axis 1106.
The OPFE 1110 may include optical lenses that are divided into m groups where m is a positive integer. The m lens group may move in the second direction Y to change an optical zoom ratio of the camera module 1100b. For example, the optical zoom ratio may be changed in a range of 3K, 5K, and so on by moving the m lens group, when K is a basic optical zoom ratio of the camera module 1100b.
The actuator 1130 may move the OPFE 1110 or the optical lens to a specific position. For example, the actuator 1130 may adjust the position of the optical lens for accurate sensing such that an image sensor 1142 may be located at a position corresponding to a focal length of the optical lens.
The image sensing device 1140 may include the image sensor 1142, a control logic 1144 and/or a memory 1146. The image sensor 1142 may capture or sense an image using the light provided through the optical lens. The control logic 1144 may control overall operations of the camera module 1100b. For example, the control logic 1144 may provide control signals through control signal line CSLb to control the operation of the camera module 1100b.
The memory 1146 may store information such as calibration data 1147 for the operation of the camera module 1100b. For example, the calibration data 1147 may include information for generation of image data based on the provided light, such as information on the above-described rotation angle, a focal length, information on an optical axis, and so on. When the camera module 1100b is implemented as a multi-state camera having a variable focal length depending on the position of the optical lens, the calibration data 1147 may include multiple focal length values and auto-focusing values corresponding to the multiple states.
The storage device 1150 may store the image data sensed using the image sensor 1142. The storage device 1150 may be disposed outside of the image sensing device 1140, and the storage device 1150 may be stacked with a sensor chip comprising the image sensing device 1140. The storage device 1150 may be implemented with an electrically erasable programmable read-only memory (EEPROM), but example embodiments are not limited thereto.
Referring to
In some example embodiments, one camera module 1100b may have a folded lens structure included the above-described prism 1105 and the OPFE 1110, and the other camera modules 1100a and 1100b may have a vertical structure without the prism 1105 and the OPFE 1110.
In some example embodiments, one camera module 1100c may be a depth camera configured to measure distance information of an object using an infrared light. In some example embodiments, the application processor 1200 may merge the distance information provided from the depth camera 1100c and image data provided from the other camera modules 1100a and 1100b to generate a three-dimensional depth image.
In some example embodiments, at least two camera modules among the camera modules 1100a, 1100b and 1100c may have different field of views, for example, through different optical lenses.
In some example embodiments, each of the camera modules 1100a, 1100b and 1100c may be separated physically from each other. In other words, the camera modules 1100a, 1100b and 1100c may each include a dedicated image sensor 1142.
The application processor 1200 may include an image processing device 1210, a memory controller 1220 and an internal memory 1230. The application processor 1200 may be separated from the camera modules 1100a, 1100b and 1100c. For example, the application processor 1200 may be implemented as one chip and the camera modules 1100a, 1100b and 1100c may implemented as another chip or other chips.
The image processing device 1210 may include a plurality of sub processors 1212a, 1212b and 1212c, an image generator 1214 and a camera module controller 1216.
The image data generated by the camera modules 1100a, 1100b and 1100c may be provided to the sub processors 1212a, 1212b and 1212c through distinct image signal lines ISLa, ISLb and ISLc, respectively. For example, the transfer of the image data may be performed using a camera serial interface (CSI) based on the mobile industry processor interface (MIPI), but example embodiments are not limited thereto.
In some example embodiments, one sub processor may be assigned commonly to two or more camera modules. In some example embodiments, a multiplexer may be used to transfer the image data selectively from one of the camera modules to the shared sub processor.
The image data from the sub processors 1212a, 1212b and 1212c may be provided to the image generator 1214. The image generator 1214 may generate an output image using the image data from the sub processors 1212a, 1212b and 1212c according to image generating information or a mode signal. For example, the image generator 1213 may merge at least a portion of the image data from the camera modules 1100a, 1100b and 1100c having the different fields of view to generate the output image according to the image generating information or the mode signal. In addition, the image generator 1214 may select, as the output image, one of the image data from the camera modules 1100a, 1100b and 1100c according to the image generating information or the mode signal.
In some example embodiments, the image generating information may include a zoom factor or a zoom signal. In some example embodiments, the mode signal may be a signal based on a selection of a user.
When the image generating information is the zoom factor and the camera modules 1100a, 1100b and 1100c have the different field of views, the image generator 1214 may perform different operation depending on the zoom signal. For example, when the zoom signal is a first signal, the image generator 1214 may merge the image data from the different camera modules to generate the output image. When the zoom signal is a second signal different from the first signal, the image generator 1214 may select, as the output image, one of image data from the camera modules 1100a, 1100b and 1100c.
In some example embodiments, the image generator 1214 may receive the image data of different exposure times from the camera modules 1100a, 1100b and 1100c. In some example embodiments, the image generator 1214 may perform high dynamic range (HDR) processing with respect to the image data from the camera modules 1100a, 1100b and 1100c to generate the output image having the increased dynamic range.
The camera module controller 1216 may provide control signals to the camera modules 1100a, 1100b and 1100c. The control signals generated by the camera module controller 1216 may be provided to the camera modules 1100a, 1100b and 1100c through the distinct control signal lines CSLa, CSLb and CSLc, respectively.
In some example embodiments, one of the camera modules 1100a, 1100b and 1100c may be designated as a master camera according to the image generating information of the mode signal, and the other camera modules may be designated as slave cameras.
The camera module acting as the master camera may be changed according to the zoom factor or an operation mode signal. For example, when the camera module 1100a has the wider field of view than the camera module 1100b and the zoom factor indicates a lower zoom magnification, the camera module 1100b may be designated as the master camera. In contrast, when the zoom factor indicates a higher zoom magnification, the camera module 1100a may be designated as the master camera.
In some example embodiments, the control signals provided from the camera module controller 1216 may include a synch enable signal. For example, when the camera module 1100b is the master camera and the camera modules 1100a and 1100c are the slave cameras, the camera module controller 1216 may provide the synch enable signal to the camera module 1100b. The camera module 1100b may generate a synch signal based on the provided synch enable signal and provide the synch signal to the camera modules 1100a and 1100c through a synch signal line SSL. As such, the camera modules 1100a, 1100b and 1100c may transfer the synchronized image data to the application processor 1200 based on the synch signal.
In some example embodiments, the control signals provided from the camera module controller 1216 may include information on the operation mode. The camera modules 1100a, 1100b and 1100c may operate in a first operation mode or a second operation mode based on the information from the camera module controller 1216.
In the first operation mode, the camera modules 1100a, 1100b and 1100c may generate image signals with a first speed (e.g., a first frame rate) and encode the image signals with a second speed higher than the first speed (e.g., a second frame rate higher than the first frame rate) to transfer the encoded image signals to the application processor 1200. The second speed may be lower than thirty times the first speed. The application processor 1200 may store the encoded image signals in the internal memory 1230 or the external memory 1400. The application processor 1200 may read out and decode the encoded image signals to provide display data to a display device. For example, the sub processors 1212a, 1212b and 1212c may perform the decoding operation and the image generator 1214 may process the decoded image signals.
In the second operation mode, the camera modules 1100a, 1100b and 1100c may generate image signals with a third speed lower than the first speed (e.g., the third frame rate lower than the first frame rate) to transfer the generated image signals to the application processor 1200. In other words, the image signals that are not encoded may be provided to the application processor 1200. The application processor 1200 may process the received image signals or store the receive image signals in the internal memory 1230 or the external memory 1400.
The PMIC 1300 may provide a power supply voltage to the camera modules 1100a, 1100b and 1100c, respectively. For example, the PMIC 1300 may provide, under control of the application processor 1200, a first power to the camera module 1100a through a power line PSLa, a second power to the camera module 1100b through a power line PSLb, and a third power to the camera module 1100c through a power line PSLc.
The PMIC 1300 may generate the power respectively corresponding to the camera modules 1100a, 1100b and 1100c and control power levels, in response to a power control signal PCON from the application processor 1200. The power control signal PCON may include information on the power depending on the operation modes of the camera modules 1100a, 1100b and 1100c. For example, the operation modes may include a low power mode in which the camera modules 1100a, 1100b and 1100c operate in low powers. The power levels of the camera modules 1100a, 1100b and 1100c may be the same as or different from each other. In addition, the power levels may be changed dynamically or adaptively.
As described above, the pixel array and the image sensor including the pixel group according to example embodiments may enhance an image quality by increasing sensing sensitivity of the unit pixel through the shared structure of the floating diffusion region and the symmetric structure of the vertical transfer gates. In addition, the pixel group according to example embodiments may reduce cross-talk between the unit pixels and further enhance the image quality through the trench structure extending in the vertical direction from the upper surface of the semiconductor substrate and to the lower surface of the semiconductor substrate. In addition, the pixel array and the image sensor including the pixel group according to example embodiments may more efficiently implement a high dynamic range (HDR) through independent driving of the two vertical transfer gates in each unit pixel.
One or more of the elements disclosed above may include or be implemented in one or more processing circuitries such as hardware including logic circuits; a hardware/software combination such as a processor executing software; or a combination thereof. For example, the processing circuitries more specifically may include, but is not limited to, a central processing unit (CPU), an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a System-on-Chip (SoC), a programmable logic unit, a microprocessor, application-specific integrated circuit (ASIC), etc.
The example embodiments may be applied to any electronic devices and systems including an image sensor. For example, the example embodiments may be applied to systems such as a mobile phone, a smart phone, a personal digital assistant (PDA), a portable multimedia player (PMP), a digital camera, a camcorder, a personal computer (PC), a server computer, a workstation, a laptop computer, a digital TV, a set-top box, a portable game console, a navigation system, a wearable device, an internet of things (IoT) device, an internet of everything (IoE) device, an e-book, a virtual reality (VR) device, an augmented reality (AR) device, an augmented reality (AR) device, a vehicle navigation device, a video phone, a monitoring system, an auto focusing system, a tracking system, a motion detection system, etc.
The foregoing is illustrative of example embodiments and is not to be construed as limiting thereof. Although a few example embodiments have been described, those skilled in the art will readily appreciate that many modifications are possible in the example embodiments without materially departing from the example embodiments.
Lee, Seungjoon, Kim, Jihun, Lee, Kyungho, Yun, Jungbin
Patent | Priority | Assignee | Title |
Patent | Priority | Assignee | Title |
10177192, | Jul 04 2016 | SK Hynix Inc. | Image sensor having photodiodes sharing one color filter and one micro-lens |
10586818, | Aug 22 2011 | Sony Corporation | Solid-state imaging device, camera module and electronic apparatus |
10741593, | May 24 2019 | OmniVision Technologies, Inc. | Vertical transfer gate storage for a global shutter in an image sensor |
10748958, | Jun 09 2008 | Sony Corporation | Solid-state imaging device, drive method thereof and electronic apparatus |
9609250, | Aug 19 2014 | SAMSUNG ELECTRONICS CO , LTD | Unit pixels for image sensors and pixel arrays comprising the same |
9748299, | Aug 06 2014 | Samsung Electronics Co., Ltd. | Pixel, image sensor including the same, and portable electronic device including the image sensor |
20180294297, | |||
20190104261, | |||
20210020674, | |||
KR1020180004480, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jul 13 2021 | LEE, SEUNGJOON | SAMSUNG ELECTRONICS CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 057118 | /0542 | |
Jul 13 2021 | YUN, JUNGBIN | SAMSUNG ELECTRONICS CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 057118 | /0542 | |
Jul 13 2021 | KIM, JIHUN | SAMSUNG ELECTRONICS CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 057118 | /0542 | |
Jul 15 2021 | LEE, KYUNGHO | SAMSUNG ELECTRONICS CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 057118 | /0542 | |
Aug 03 2021 | Samsung Electronics Co., Ltd. | (assignment on the face of the patent) | / |
Date | Maintenance Fee Events |
Aug 03 2021 | BIG: Entity status set to Undiscounted (note the period is included in the code). |
Date | Maintenance Schedule |
Apr 25 2026 | 4 years fee payment window open |
Oct 25 2026 | 6 months grace period start (w surcharge) |
Apr 25 2027 | patent expiry (for year 4) |
Apr 25 2029 | 2 years to revive unintentionally abandoned end. (for year 4) |
Apr 25 2030 | 8 years fee payment window open |
Oct 25 2030 | 6 months grace period start (w surcharge) |
Apr 25 2031 | patent expiry (for year 8) |
Apr 25 2033 | 2 years to revive unintentionally abandoned end. (for year 8) |
Apr 25 2034 | 12 years fee payment window open |
Oct 25 2034 | 6 months grace period start (w surcharge) |
Apr 25 2035 | patent expiry (for year 12) |
Apr 25 2037 | 2 years to revive unintentionally abandoned end. (for year 12) |