An imaging device capable of phase difference focus detection is described. The imaging device includes a plurality of pixels that are 2-dimensionally arranged and which receive image light. At least one pixel of the plurality of pixels comprises: a micro lens; a plurality of photoelectric conversion units, which are biased around an optical axis of the micro lens; and a control unit, which limits generation of electrons photoelectrically converted at at least one photoelectric conversion unit of the plurality of photoelectric conversion units.
1. An imaging device comprising:
a plurality of pixels that are 2-dimensionally arranged and which receive image light,
wherein at least one pixel of the plurality of pixels comprises:
a micro lens;
a plurality of photoelectric conversion units, which are biased around an optical axis of the micro lens; and
a control unit, which limits generation of electrons photoelectrically converted at at least one photoelectric conversion unit of the plurality of photoelectric conversion units.
12. An imaging device comprising:
a plurality of pixels that are 2-dimensionally arranged and which receive image light,
wherein at least one pixel of the plurality of pixels comprises:
a micro lens;
a plurality of photoelectric conversion units, which are biased around an optical axis of the micro lens; and
a control unit, which selects a first output mode for outputting electrons photoelectrically converted at the plurality of photoelectric conversion units or a second output mode for outputting only electrons photoelectrically converted at one of the plurality of photoelectric conversion units;
wherein generation of electrons photoelectrically converted at at least one photoelectric conversion unit is limited in the second output mode.
2. The imaging device of
the control unit limits generation of photoelectrically converted electrons by changing an electric potential of at least one photo diode of the plurality of photo diodes.
3. The imaging device of
the control unit limits generation of photoelectrically converted electrons by changing a gate electric potential of an electron generating unit of at least one photo transistor of the plurality of photo transistors.
4. The imaging device of
the control unit comprises a reset unit for discharging electrons generated by the plurality of photo diodes and limits generation of photoelectrically converted electrons by discharging electrons generated by at least one photo diode of the plurality of photo diodes.
5. The imaging device of
the reset unit is independent from an output unit.
6. The imaging device of
7. The imaging device of
8. The imaging device of
9. The imaging device of
10. The imaging device of
11. The imaging device of
13. The imaging device of
the control unit selects the second output mode for a phase difference focusing operation.
14. The imaging device of
15. The imaging device of
16. The imaging device of
17. The imaging device of
18. The imaging device of
19. The imaging device of
20. The imaging device of
wherein the control unit selects the second output mode for the pixels arranged in the horizontal direction when the pixels arranged in the vertical direction correspond to the first output mode; and
wherein the control unit selects the second output mode for the pixels arranged in the vertical direction when the pixels arranged in the horizontal direction correspond to the first output mode.
21. The imaging device of
22. The imaging device of
the plurality of pixels, each of which includes a respective plurality of photoelectric conversion units, comprise the plurality of photoelectric conversion units biased in a horizontal direction and a vertical direction.
This application claims the priority benefit of Korean Patent Application No. 10-2013-0059261, filed on May 24, 2013, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
1. Technical Field
The present disclosure relates to an imaging sensor, and more particularly, to an imaging sensor capable of detecting a phase difference of focus.
2. Related Art
In a digital photographing apparatus, such as a camera or a camcorder, it is desirable to focus precisely on an object in order to capture a clear still image or a clear moving picture. Examples of auto focus (AF) mechanisms for automatically adjusting focus include contrast AF and phase difference AF.
Contrast AF is a mechanism that acquires contrast values from the image signals generated by an imaging sensor while the position of a focus lens is changed, and then moves the focus lens to the position corresponding to the peak contrast value.
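As a rough illustration of the contrast AF search described above, the Python sketch below scans a set of focus positions, scores each frame with a simple gradient-based contrast metric, and returns the position with the peak value. The `capture_frame` callable and the particular metric are assumptions made for illustration; they are not part of the disclosure.

```python
import numpy as np

def contrast_metric(frame: np.ndarray) -> float:
    # Sum of squared horizontal gradients: larger when edges are sharper.
    gx = np.diff(frame.astype(np.float64), axis=1)
    return float(np.sum(gx * gx))

def contrast_af(capture_frame, focus_positions):
    """Move the focus lens through focus_positions, score each captured frame,
    and return the position with the peak contrast value."""
    scores = [(pos, contrast_metric(capture_frame(pos))) for pos in focus_positions]
    best_pos, _ = max(scores, key=lambda s: s[1])
    return best_pos

# Hypothetical usage: capture_frame(pos) would drive the focus lens to `pos`
# and read one frame from the imaging sensor.
```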
Phase difference AF is a mechanism that employs a separate sensing device and detects a focal point based on the phase difference of light applied to the sensing device.
Phase difference AF is generally faster and more precise than contrast AF. However, phase difference AF requires a mirror for detecting a focal point, which increases the size of a photographing device employing it. Furthermore, it may be difficult to detect a focal point while images are being captured successively.
To resolve this problem, a method has been introduced in which phase difference AF is performed without a mirror by arranging phase difference detecting pixels, capable of performing phase difference AF, in an imaging sensor.
However, the outputs of phase difference detecting pixels arranged between imaging pixels differ significantly from those of the remaining pixels. The phase difference detecting pixels therefore appear as defective pixels in an output image and cause deterioration of captured images. The same problem may occur even if the phase difference detecting pixels are used as imaging pixels.
Various embodiments include an imaging device capable of phase difference focus detection, in which phase difference detecting pixels may also be used as imaging pixels, so that phase differences can be detected and images can be captured without deterioration of image quality.
Embodiments also include an imaging device capable of phase difference focus detection, in which pixels may be switched between phase difference detecting pixels and imaging pixels.
Embodiments also include an imaging device capable of phase difference focus detection, in which charge output of a photo diode or a photo transistor is limited or charge generation at a photoelectric conversion unit is limited when pixels are switched between phase difference detecting pixels and imaging pixels.
In an embodiment, an imaging device includes a plurality of pixels that are 2-dimensionally arranged and which receive image light. At least one pixel of the plurality of pixels includes: a micro lens; a plurality of photoelectric conversion units, which are biased around an optical axis of the micro lens; and a control unit, which limits generation of electrons photoelectrically converted at at least one photoelectric conversion unit of the plurality of photoelectric conversion units.
The plurality of photoelectric conversion units may include a plurality of photo diodes. The control unit may limit generation of photoelectrically converted electrons by changing electric potential of at least one photo diode of the plurality of photo diodes.
The plurality of photoelectric conversion units may include a plurality of photo transistors. The control unit may limit generation of photoelectrically converted electrons by changing a gate electric potential of an electron generating unit of at least one photo transistor of the plurality of photo transistors.
The plurality of photoelectric conversion units may include a plurality of photo diodes. The control unit may include a reset unit for discharging electrons generated by the photo diodes and may limit generation of photoelectrically converted electrons by discharging electrons generated by at least one photo diode of the plurality of photo diodes.
The reset unit may include a reset circuit for discharging electrons. The reset unit may be independent from an output unit.
At least one pixel, of the plurality of pixels, that includes the plurality of photoelectric conversion units may be arranged only at a particular region of the imaging device.
From the pixels arranged only at the particular region, pixels of which the photoelectric conversion units are biased in a same direction may be arranged in the same direction as the direction in which the corresponding photoelectric conversion units are biased.
Pixels of the plurality of pixels that include photoelectric conversion units biased in a horizontal direction may be arranged at the imaging device in the horizontal direction.
Pixels of the plurality of pixels that include photoelectric conversion units biased in a vertical direction may be arranged at the imaging device in the vertical direction.
Each pixel of the plurality of pixels included in the imaging device may include a respective plurality of the photoelectric conversion units.
The plurality of pixels, each of which includes the respective plurality of photoelectric conversion units, includes the plurality of photoelectric conversion units biased in a horizontal direction and a vertical direction.
According to another embodiment, an imaging device includes a plurality of pixels that are 2-dimensionally arranged and which receive image light. At least one pixel of the plurality of pixels includes: a micro lens; a plurality of photoelectric conversion units, which are biased around an optical axis of the micro lens; and a control unit, which selects a first output mode for outputting electrons photoelectrically converted at the plurality of photoelectric conversion units or a second output mode for outputting only electrons photoelectrically converted at one of the plurality of photoelectric conversion units. Generation of electrons photoelectrically converted at at least one photoelectric conversion unit is limited in the second output mode.
The control unit may select the first output mode for an imaging operation. The control unit may select the second output mode for a phase difference focusing operation.
Electrons photoelectrically converted at the plurality of photoelectric conversion units may be combined and output in the first output mode.
The at least one pixel may further include a read-out unit which outputs electrons photoelectrically converted at the plurality of photoelectric conversion units.
The read-out unit may include a plurality of read-out transistors for selectively outputting the photoelectrically converted electrons from the plurality of photoelectric conversion units.
Only electrons photoelectrically converted at one photoelectric conversion unit of the plurality of photoelectric conversion units may be output by selectively operating the plurality of read-out transistors in the second output mode.
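As a minimal sketch of the two output modes summarized above, the following fragment assumes that a control unit simply maps the current operation to a mode and operates the read-out transistors accordingly; the names and the string used to identify the phase difference operation are illustrative, not terms defined by the disclosure.

```python
from enum import Enum

class OutputMode(Enum):
    FIRST = "combine all photoelectric conversion units"      # imaging operation
    SECOND = "output only one photoelectric conversion unit"  # phase difference focusing

def select_output_mode(operation: str) -> OutputMode:
    # The control unit selects the first mode for an imaging operation and
    # the second mode for a phase difference focusing operation.
    return OutputMode.SECOND if operation == "phase_difference_af" else OutputMode.FIRST

def read_out(charges, mode: OutputMode):
    """Operate the read-out transistors: combine all units in the first mode,
    pass only the first unit in the second mode."""
    return sum(charges) if mode is OutputMode.FIRST else charges[0]

# Hypothetical usage:
# mode = select_output_mode("phase_difference_af")
# signal = read_out([120, 118], mode)   # -> 120 (single-unit output)
```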
The at least one pixel that includes the plurality of photoelectric conversion units may be arranged only at a particular region of the imaging device.
From the at least one pixel arranged only at the particular region, pixels of which the plurality of photoelectric conversion units are biased in a same direction are arranged in the same direction as the direction in which the corresponding photoelectric conversion units are biased.
The at least one pixel that includes the plurality of photoelectric conversion units is arranged at the imaging device in a horizontal direction and a vertical direction. The control unit may select the second output mode for the pixels arranged in the horizontal direction when the pixels arranged in the vertical direction correspond to the first output mode. The control unit may select the second output mode for the pixels arranged in the vertical direction when the pixels arranged in the horizontal direction correspond to the first output mode.
Pixels at points where the pixels arranged in the horizontal direction and the pixels arranged in the vertical direction intersect may include a plurality of photoelectric conversion units biased in the horizontal direction and the vertical direction.
Each pixel of the plurality of pixels included in the imaging device may include the plurality of photoelectric conversion units. The plurality of pixels, each of which includes a respective plurality of photoelectric conversion units, may include the plurality of photoelectric conversion units biased in a horizontal direction and a vertical direction.
The above and other embodiments of the present disclosure will become more apparent by describing various embodiments thereof in detail with reference to the attached drawings.
For the purposes of promoting an understanding of the principles of the invention, reference has been made to the embodiments illustrated in the drawings, and specific language has been used to describe these embodiments. However, no limitation of the scope of the invention is intended by this specific language, and the invention should be construed to encompass all embodiments that would normally occur to one of ordinary skill in the art. The terminology used herein is for the purpose of describing the particular embodiments and is not intended to be limiting of exemplary embodiments of the invention. In the description of the embodiments, certain detailed explanations of related art are omitted when it is deemed that they may unnecessarily obscure the essence of the invention.
While such terms as “first,” “second,” etc., may be used to describe various components, such components must not be limited to the above terms. The above terms are used only to distinguish one component from another.
The terms used in the present specification are merely used to describe particular embodiments, and are not intended to limit the invention. An expression used in the singular encompasses the expression of the plural, unless it has a clearly different meaning in the context. In the present specification, it is to be understood that the terms such as “including” or “having,” etc., are intended to indicate the existence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof may exist or may be added.
Various embodiments will be described below in more detail with reference to the accompanying drawings. Those components that are the same or are in correspondence are rendered the same reference numeral regardless of the figure number, and redundant explanations are omitted. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
Referring to
The lens 1000 of the electronic apparatus 100 includes an imaging lens 101 including a focus lens 102. The electronic apparatus 100 may perform a focus detecting function for driving the focus lens 102. The lens 1000 includes a lens driving unit 103 for driving the focus lens 102, a lens position detecting unit 104 for detecting position of the focus lens 102, and a lens control unit 105 for controlling the focus lens 102. The lens control unit 105 exchanges information regarding focus detection with a CPU 106 of the electronic apparatus 100 via an interface 129.
The electronic apparatus 100 includes an imaging device 108 and generates image signals by capturing light from an object transmitted through the imaging lens 101. The imaging device 108 may include a plurality of photoelectric conversion units (not shown) and a transmission path (not shown) for reading out image signals by moving electrons from the photoelectric conversion units.
The imaging device control unit 107 generates timing signals for the imaging device 108 to capture images. Furthermore, the imaging device control unit 107 sequentially reads out image signals after electrons are accumulated at respective scan lines.
The read-out image signals are converted to digital signals by an analog signal processing unit 109 and an analog-to-digital (A/D) converting unit 110 and are input to and processed by an image input controller 111.
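As a rough sketch of the read-out path just described (sequential read-out per scan line followed by A/D conversion), the fragment below models per-row saturation and quantization; the full-well value and bit depth are illustrative assumptions, not parameters stated in the disclosure.

```python
import numpy as np

def read_out_frame(accumulated_charge: np.ndarray,
                   full_well: float = 10000.0, bits: int = 12) -> np.ndarray:
    """Read the accumulated charge scan line by scan line and convert each
    analog row to digital codes (a stand-in for units 109 and 110)."""
    rows = []
    for line in accumulated_charge:                  # sequential read-out per scan line
        analog = np.clip(line, 0.0, full_well)       # saturate at the assumed full-well capacity
        digital = np.round(analog / full_well * (2 ** bits - 1)).astype(np.uint16)
        rows.append(digital)
    return np.stack(rows)
```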
Digital image signals that are input to the image input controller 111 are processed by an auto white balance (AWB) detecting unit 116, an auto exposure (AE) detecting unit 117, and an AF detecting unit 118 for AWB calculation, AE calculation, and AF calculation, respectively. Here, the AF detecting unit 118 outputs detected contrast values during contrast AF and outputs pixel information to the CPU 106 for phase difference calculation during phase difference AF. The CPU 106 may perform the phase difference calculation by correlating a plurality of pixel column signals; as a result, the position or direction of focus may be calculated.
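The correlation performed by the CPU 106 on pixel column signals can be sketched as follows. A sum-of-absolute-differences search over candidate shifts is one common way to implement such a correlation; it is used here only as an illustration and is not necessarily the exact calculation performed by the apparatus.

```python
import numpy as np

def phase_shift(r_column: np.ndarray, l_column: np.ndarray, max_shift: int = 16) -> int:
    """Return the shift (in pixels) that best aligns the R and L pixel column signals.
    The sign of the result indicates the direction of defocus.
    Assumes both columns are longer than max_shift."""
    r = r_column.astype(np.float64)
    l = l_column.astype(np.float64)
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = r[s:], l[:len(l) - s]
        else:
            a, b = r[:len(r) + s], l[-s:]
        cost = float(np.mean(np.abs(a - b)))   # normalized sum of absolute differences
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift
```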
Image signals are also stored in a synchronous dynamic random access memory (SDRAM) or memory 119. A digital signal processing unit 112 generates displayable live view images or captured images by performing a series of image signal processes, such as gamma correction. A compression/decompression unit 113 compresses image signals or decompresses compressed image signals for playback according to compression formats, such as JPEG compression format or H.264 compression format. An image file including image signals compressed by the compression/decompression unit 113 may be transmitted to a memory card 122 via a memory controller 121 and is stored in the memory card 122. Data regarding images to be displayed is stored in a video random access memory (VRAM) 120, and the images to be displayed are displayed on a liquid crystal display (LCD) or other display unit 115 via a display controller 114. The CPU 106 controls the overall operations of one or more of the components stated above. An electrically erasable programmable read-only memory (EEPROM) 123 stores and maintains data for correcting pixel defects of the imaging device 108 or adjustment data. An operating console 124 receives inputs of various commands from a user for operating the electronic apparatus 100. The operating console 124 may include various buttons, such as a shutter release button, a main button, a mode dial, and a menu button. The electronic apparatus 100 may also include an auxiliary light control unit 125.
Light from an object transmitted through the imaging lens 101 passes through a micro lens array 14 and is guided to light receiving pixels R 15 and L 16. Light screens 17 and 18 or limited apertures for limiting pupils 12 and 13 from the imaging lens 101 are arranged at portions of the light receiving pixels R 15 and L 16. Furthermore, light from the pupil 12 above the optical axis 10 of the imaging lens 101 is guided to the light receiving pixel L 16, whereas light from the pupil 13 below the optical axis 10 of the imaging lens 101 is guided to the light receiving pixel R 15. Guiding lights inversely projected at the pupils 12 and 13 by the micro lens array 14 to the light receiving pixels R 15 and L 16 is referred to as pupil division.
The continuous outputs of the light receiving pixels R 15 and L 16 obtained by pupil division through the micro lens array 14 have the same shape but different phases with respect to position. This is because the image formation positions of light from the eccentrically formed pupils 12 and 13 of the imaging lens 101 differ from each other. Thus, when the focus points of light from the eccentrically formed pupils 12 and 13 do not coincide, the light receiving pixels R 15 and L 16 exhibit different output phases; when they do coincide, the images are formed at the same position. In addition, the direction of focus may be determined from this difference.
For example, in a front focus state, the phase of the output of the light receiving pixel R 15 is shifted further to the left than in the focused state, and the phase of the output of the light receiving pixel L 16 is shifted further to the right than in the focused state. In contrast, in a back focus state, the phase of the output of the light receiving pixel R 15 is shifted further to the right than in the focused state, and the phase of the output of the light receiving pixel L 16 is shifted further to the left than in the focused state. The shift amount between the phases of the light receiving pixels R 15 and L 16 may be converted to a deviation amount between the focuses.
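Building on the correlation sketch above, the sign of the measured shift distinguishes front focus from back focus, and its magnitude can be converted to a defocus amount. The sign convention and the conversion factor below are placeholders; in practice the factor depends on the optical geometry (the base length of the divided pupils), which is not specified here.

```python
def defocus_from_shift(shift_px: int, px_to_defocus: float = 1.0):
    """Interpret the R/L phase shift: the sign gives the focus direction and the
    magnitude, times a geometry-dependent factor, gives the defocus amount."""
    if shift_px == 0:
        direction = "in focus"
    elif shift_px > 0:
        direction = "front focus"   # assumed sign convention
    else:
        direction = "back focus"
    return direction, abs(shift_px) * px_to_defocus
```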
However, in the above-stated structure of
Referring to
Referring to
Referring to
The photo transistor 46 also receives and photoelectrically converts light. Furthermore, a phase difference detecting pixel control line 42 may be connected to a gate electrode of the photo transistor 46. Here, the phase difference detecting pixel control line 42 may be a control line for controlling the gate current of the photo transistor 46 and for turning the photoelectric conversion output on and off.
Referring to
Referring to
Referring to
Description of the photo transistor for controlling a gate of an N-type substrate shown in
Referring to
Since the photo transistor for controlling a gate of an N-type substrate shown in
Referring to
The N-type layers 1112 and 1114, the N-type floating diffusion layer FD(111), and the transmission gate TG implement a MOS transistor Tr1, whereas the N-type floating diffusion layer FD(111), the N-type diffusion layer D(110), and the transmission gate RS(103) implement a MOS transistor Tr2. Next, a gate of a MOS transistor T3 is connected to the N-type floating diffusion layer FD(111). Electrons generated by the photo diodes are amplified by the MOS transistor T3 through a voltage potential VPD(101) and, when the pixel is selected for output by a gate 1108 of the MOS transistor T4, the electrons are output via a vertical output line LV 118. In this case, each pixel of an imaging device according to an embodiment may include two photo diodes and four transistors, where a control electrode PX(105) is connected to an end of one photo diode to change its electric potential, such that electrons generated by that photo diode are not transmitted to the N-type floating diffusion layer FD(111). By controlling the control electrode PX(105), only electrons generated by one photo diode may be output while the pixel is detecting a phase difference.
The photo diodes 1121 and 1122 share a common read-out unit 1123. Transmission transistors Tr21 and Tr22 are arranged between the photo diodes 1121 and 1122 and the common read-out unit 1123 and are connected by wiring to a transmission signal line T1(126). Here, the photo diodes 1121 and 1122 are arranged such that their openings have the same area. Therefore, although the photo diode 1122 is larger than the photo diode 1121, the portions other than the opening are blocked from light, and thus the photo diode 1121 and the photo diode 1122 are set to the same sensitivity. Next, an electron control unit 1124 is arranged at the light-blocking portion of the photo diode 1122, and a control line PX(125) is connected thereto. If the pixel 1120 is to be used as a phase difference detecting pixel, the electron control unit 1124 may be turned on to prevent generation and output of electrons. If the pixel 1120 is used as an imaging pixel, the electron control unit 1124 may be turned off, such that both the photo diode 1121 and the photo diode 1122 may output electrons.
Referring to
An amplified signal is transmitted from a terminal 1138 of a read-out selecting transistor Tr51, arranged at the image signal read-out line V(139), to an output line LV(140) and is output as a pixel output. Furthermore, at the read-out unit 1133 shared by the left pixel 1120 and the right pixel 1130, a reset transistor Tr61 may be arranged between the terminal 1141 of the output line 1140 and a reset line RS(142), and thus the electrons of the two pixels 1120 and 1130 may be discharged simultaneously. If the pixel 1120 and the pixel 1130 are used as phase difference detecting pixels, the electron control unit 1124 and the electron control unit 1134 may be operated by control signals from the control line PX(125), and thus the pixel 1120 and the pixel 1130 may be controlled simultaneously. Furthermore, an imaging device may perform phase difference detection using 2-dimensionally arranged units, each of which includes the pixel 1120 and the pixel 1130. Although the pixel 1120 and the pixel 1130 are horizontally connected to each other as illustrated, the present disclosure is not limited thereto, and the pixel 1120 and the pixel 1130 may be vertically or diagonally arranged in an imaging device.
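A loose behavioral model of the shared control just described — one PX line switching both pixels of a unit into phase difference mode at the same time, and one reset line discharging both — is sketched below. The class and attribute names are hypothetical, and the electrical behavior is reduced to simple accumulation flags.

```python
class PixelPairUnit:
    """Left/right pixel pair sharing the read-out path, the reset line, and the PX control line."""

    def __init__(self):
        self.left = [0, 0]    # charges of the two photo diodes in the left pixel
        self.right = [0, 0]   # charges of the two photo diodes in the right pixel
        self.px = False       # PX asserted -> both pixels act as phase difference detecting pixels

    def set_phase_detection(self, enabled: bool):
        self.px = enabled     # one control signal switches both pixels at once

    def expose(self, left_light, right_light):
        for charge, light in ((self.left, left_light), (self.right, right_light)):
            charge[0] += light[0]
            if not self.px:                 # generation at the controlled diode is limited
                charge[1] += light[1]

    def reset(self):
        # The shared reset transistor discharges the electrons of both pixels simultaneously.
        self.left = [0, 0]
        self.right = [0, 0]

    def read_out(self):
        left_out = self.left[0] + self.left[1]
        right_out = self.right[0] + self.right[1]
        self.reset()
        return left_out, right_out
```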
In
Referring to
Referring to
According to an embodiment, when the pixel 160 is used as a phase difference detecting pixel, only electrons from the photo diode 162 may be output by preventing electron output of the photo diode 161 by turning off the transmission signal line TL1(164).
When the pixel 160 is used as an imaging pixel, electrons from both the photo diodes 161 and 162 may be output by turning on the transmission signal line TL1(164). However, the present embodiment is not limited thereto, and the particular photo diode that outputs electrons may be selected as needed. Therefore, this embodiment allows flexibility in configuring the phase difference detecting pixel.
Referring to
Furthermore, the outputs from the two photo diodes 233 and 234, that is, the electron output from the pixel 231 and the electron output from the pixel 232, are routed through the common read-out unit 235 and the common read-out unit 238 (for the photo diodes PD1 and PD2 of the pixel 232) and the transmission transistors TR83 and Tr87. An electron output line 239 is connected to the electron output units 235 and 238 of the pixels 231 and 232 and to an amplification transistor TR88 shared by the left pixel 231 and the right pixel 232. Outputs of the pixels 231 and 232 are output via one of the selected transmission lines T1(126) and T2.
The amplified signal is transmitted to an output line LV via a terminal 240 of a read-out transistor TR89 arranged at the image signal read-out line V and is output as a pixel output. An imaging device may perform phase difference detection with units, in which the two pixels 231 and 232 are combined with each other, arranged 2-dimensionally. Although the two pixels 231 and 232 are horizontally connected to each other, the present disclosure is not limited thereto, and the pixels may be arranged vertically or diagonally in an imaging device.
The pixel 170 includes: a common read-out unit 180 for the photoelectric conversion units 171, 172, 173, and 174; a transmission transistor Tr91 configured between the common read-out unit 180, the photoelectric conversion unit 171, and a transmission signal line TU1(186); a transmission transistor Tr92 configured between the common read-out unit 180, the photoelectric conversion unit 172, and the transmission signal line TU1(186); a transmission transistor Tr93 configured between the common read-out unit 180, the photoelectric conversion unit 173, and a transmission signal line TD1(187); and a transmission transistor Tr94 configured between the common read-out unit 180, the photoelectric conversion unit 174, and the transmission signal line TD1(187). An output of the phase difference detecting pixel 170 may be output via one of the transmission signal line TU1(186) or the transmission signal line TD1(187).
The common read-out unit 180 is connected to an electron output line 181, and a front end of the common read-out unit 180 is connected to a terminal 182 of an amplification transistor Tr95. Therefore, an output of the phase difference detecting pixel is amplified by the amplification transistor Tr95. The amplified signal is output from a terminal 183 of a read-out selection transistor Tr96, arranged at a portion of the image signal reading line V(184), via an output line LV(185). Furthermore, the terminal 182 of the common electron output line 181 includes a reset transistor Tr97 configured between a terminal 188 of the output line LV(185) and a reset line RS(189). The reset transistor Tr97 may discharge the electrons of the four photoelectric conversion units 171, 172, 173, and 174 at once in response to a reset signal.
If the pixel 170 is selected for use as a phase difference detecting pixel according to an embodiment, the two photoelectric conversion units 172 and 174 may be simultaneously controlled by controlling the electron control units 175 and 176 based on a control signal from a phase difference detecting pixel control line PX(186). For example, in the case of detecting a horizontal phase difference, the control units 175 and 176 are turned on, such that the pixel 170 functions as a phase difference detecting pixel for detecting a horizontal phase difference. Furthermore, the two photoelectric conversion units 173 and 174 may be simultaneously controlled by controlling the electron control units 177 and 178 based on a control signal from the other phase difference detecting pixel control line PY(179). In this case, in order to detect a vertical phase difference, the control units 177 and 178 may be turned on, such that the pixel 170 functions as a phase difference detecting pixel for detecting a vertical phase difference.
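The horizontal/vertical selection described above can be summarized in a small sketch: asserting the PX-controlled pair of electron control units yields a pixel that detects a horizontal phase difference, and asserting the PY-controlled pair yields a pixel that detects a vertical phase difference. The mapping of unit numbers to control lines below follows the description only loosely and is meant as an illustration, not as the exact layout of the disclosed pixel.

```python
def configure_quad_pixel(direction: str):
    """Return which of four photoelectric conversion units (numbered 1..4,
    assumed to be arranged 1-2 on the top row and 3-4 on the bottom row)
    are limited for the requested phase difference direction."""
    if direction == "horizontal":
        limited = {2, 4}    # PX line asserted: one column of units is limited
    elif direction == "vertical":
        limited = {3, 4}    # PY line asserted: one row of units is limited
    else:                   # imaging: no unit is limited
        limited = set()
    active = {1, 2, 3, 4} - limited
    return active, limited

# Hypothetical usage:
# active, limited = configure_quad_pixel("horizontal")   # -> ({1, 3}, {2, 4})
```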
However, the embodiment shown in
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
Referring to
If the selected AF region is not a multi AF region in operation S101 (e.g., AF is to be performed at an AF region selected by a user), the process proceeds to operation S105. In operation S105, the phase difference detecting pixels at the selected AF region are turned on to configure R columns and L columns suitable for detecting phase differences at the selected AF region.
Next, in operation S104, phase difference detection is performed at the selected AF region, and AF is performed based on a result of the phase difference detection. When focusing is completed, the process proceeds to operation S106.
In operation S106, the electronic apparatus 100 waits until a shutter release signal S2 is input (e.g., the shutter release button is fully pressed). When the shutter release signal S2 is input, the phase difference detecting pixels in the imaging device are switched to an imaging pixel mode; accordingly, the phase difference detecting pixels are turned off (S108). Since the switching of an imaging device to an imaging mode is described above, a detailed description thereof is omitted. When the phase difference detecting pixels are turned off, an image is captured in operation S108, and the sequence is completed.
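The AF sequence just described can be paraphrased as the flow below. The function names (`configure_phase_pixels`, `detect_phase_difference`, and so on) are placeholders for the hardware operations described above, not an API defined by the disclosure, and the multi AF branch is summarized coarsely because its intermediate operations are not detailed here.

```python
def run_af_and_capture(af_region, is_multi_af, wait_for_s2, capture_image,
                       configure_phase_pixels, detect_phase_difference, drive_focus_lens):
    # S101: branch on whether the multi AF region is selected.
    if is_multi_af:
        configure_phase_pixels(region="multi")       # R/L columns set up for the multi AF region
    else:
        # S105: turn on phase difference detecting pixels only at the selected AF region.
        configure_phase_pixels(region=af_region)

    # S104: detect the phase difference at the configured region and perform AF.
    drive_focus_lens(detect_phase_difference())

    # S106: wait for the shutter release signal S2 (fully pressed shutter release button).
    wait_for_s2()

    # Switch the phase difference detecting pixels back to the imaging pixel mode
    # (turn them off), then capture the image (S108).
    configure_phase_pixels(region=None)
    return capture_image()
```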
According to the above embodiments, phase difference detecting pixels do not appear as defective pixels in an output image and may be used as imaging pixels without deterioration of image quality. Furthermore, image quality may not deteriorate even if the number of phase difference detecting pixels is increased to improve AF efficiency.
All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.
The particular implementations shown and described herein are illustrative examples of the invention and are not intended to otherwise limit the scope of the invention in any way. For the sake of brevity, conventional electronics, control systems, software development and other functional aspects of the systems (and components of the individual operating components of the systems) may not be described in detail. Furthermore, the connecting lines, or connectors shown in the various figures presented are intended to represent exemplary functional relationships and/or physical or logical couplings between the various elements. It should be noted that many alternative or additional functional relationships, physical connections or logical connections may be present in a practical device. Moreover, no item or component is essential to the practice of the invention unless the element is specifically described as “essential” or “critical”.
The apparatus described herein may comprise a processor, a memory for storing program data to be executed by the processor, a permanent storage such as a disk drive, a communications port for handling communications with external devices, and user interface devices, including a display, touch panel, keys, buttons, etc. When software modules are involved, these software modules may be stored as program instructions or computer readable code executable by the processor on a non-transitory computer-readable media such as magnetic storage media (e.g., magnetic tapes, hard disks, floppy disks), optical recording media (e.g., CD-ROMs, Digital Versatile Discs (DVDs), etc.), and solid state memory (e.g., random-access memory (RAM), read-only memory (ROM), static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), flash memory, thumb drives, etc.). The computer readable recording media may also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion. This computer readable recording media may be read by the computer, stored in the memory, and executed by the processor.
Also, using the disclosure herein, programmers of ordinary skill in the art to which the invention pertains may easily implement functional programs, codes, and code segments for making and using the invention.
The invention may be described in terms of functional block components and various processing steps. Such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions. For example, the invention may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, where the elements of the invention are implemented using software programming or software elements, the invention may be implemented with any programming or scripting language such as C, C++, JAVA®, assembler, or the like, with the various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements. Functional aspects may be implemented in algorithms that execute on one or more processors. Furthermore, the invention may employ any number of conventional techniques for electronics configuration, signal processing and/or control, data processing and the like. Finally, the steps of all methods described herein may be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context.
The use of the terms “a,” “an,” “the,” and similar referents in the context of describing the invention (especially in the context of the following claims) is to be construed to cover both the singular and the plural. Furthermore, recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. Finally, the steps of all methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., “such as”) provided herein, is intended merely to better illuminate the invention and does not pose a limitation on the scope of the invention unless otherwise claimed. Numerous modifications and adaptations will be readily apparent to those skilled in this art without departing from the spirit and scope of the invention.
While the invention has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the following claims.