A time-of-flight-based system is operable for ambient light measurements. A method of operation includes detecting, in at least one active demodulation detection pixel, a first particular wavelength and generating amplitude data of the first particular wavelength; and detecting, in at least one spurious reflection detection pixel, a second particular wavelength and generating amplitude data of the second particular wavelength. In a computational device that stores spectrum data corresponding respectively to a plurality of different ambient light source types, an ambient lighting condition is determined based on the amplitude data of the first particular wavelength, the amplitude data of the second particular wavelength, and the spectrum data of a particular one of the ambient light source types associated with the amplitude data of the first particular wavelength and the amplitude data of the second particular wavelength.
10. A method comprising:
detecting, in at least one active demodulation detection pixel, a first particular wavelength and generating amplitude data of the first particular wavelength;
detecting, in at least one spurious reflection detection pixel, a second particular wavelength and generating amplitude data of the second particular wavelength;
identifying, in a computational device that stores spectrum data corresponding respectively to a plurality of different ambient light source types, a particular one of the ambient light source types based on a ratio of the amplitude data of the first particular wavelength and the amplitude data of the second particular wavelength and based on the stored spectrum data.
1. A time-of-flight-based optoelectronic system comprising:
at least one active demodulation detection pixel operable to detect a first particular wavelength, and being further operable to generate amplitude data of the first particular wavelength;
at least one spurious reflection detection pixel operable to detect a second particular wavelength, and being further operable to generate amplitude data of the second particular wavelength;
outputs of the at least one active demodulation detection pixel and the at least one spurious reflection detection pixel being communicatively coupled to a computational device;
the computational device including a computer storage medium storing spectrum data that corresponds, respectively, to a plurality of different ambient light source types, wherein the amplitude data of the first particular wavelength and the amplitude data of the second particular wavelength are associated with spectrum data of a particular one of the ambient light source types,
wherein the computational device is operable to identify the particular one of the ambient light source types based on a ratio of the amplitude data of the first and second particular wavelengths and based on the stored spectrum data.
2. The time-of-flight-based optoelectronic system of
3. The time-of-flight-based optoelectronic system of
4. The time-of-flight-based optoelectronic system of
5. The time-of-flight-based optoelectronic system of
6. The time-of-flight-based optoelectronic system of
7. The time-of-flight-based optoelectronic system of
8. The time-of-flight-based optoelectronic system of
9. The time-of-flight-based optoelectronic system of
a first light source operable to generate modulated electromagnetic radiation of the first particular wavelength; and
a second light source operable to generate modulated electromagnetic radiation of the second particular wavelength.
11. The method of
12. The method of
13. The method of
14. The method of
The present application claims the benefit of priority of U.S. Provisional Application No. 62/380,596, filed on Aug. 29, 2016. The entire contents of the earlier application are incorporated herein by reference.
This disclosure relates to time-of-flight-based systems operable for ambient light and distance or proximity measurements.
Some handheld computing devices such as smart phones can provide a variety of different optical functions such as one-dimensional (1D) or three-dimensional (3D) gesture detection, 3D imaging, time-of-flight or proximity detection, ambient light sensing, and/or front-facing two-dimensional (2D) camera imaging.
Time-of-flight (TOF) sensors, for example, can be used to detect the distance to an object. In general, TOF systems are based on the phase-measurement technique of emitted intensity-modulated light, which is reflected by a scene. The reflected light is imaged onto a sensor, and the photo-generated electrons are demodulated in the sensor. Based on the phase information, the distance to a point in the scene for each pixel can be determined by processing circuitry associated with the sensor.
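For illustration only, the sketch below shows how a phase measurement of this kind is commonly turned into a distance, using the standard four-sample (4-tap) demodulation; the function names, sample values, and 20 MHz modulation frequency are assumptions for the example rather than details taken from this disclosure.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def phase_and_amplitude(a0, a1, a2, a3):
    """Recover phase and amplitude from four demodulation samples taken
    90 degrees apart (standard 4-tap technique; values are illustrative)."""
    phase = math.atan2(a3 - a1, a0 - a2)
    amplitude = 0.5 * math.hypot(a3 - a1, a0 - a2)
    return phase % (2.0 * math.pi), amplitude

def distance_from_phase(phase, f_mod):
    """d = c * phase / (4 * pi * f_mod); unambiguous up to c / (2 * f_mod)."""
    return C * phase / (4.0 * math.pi * f_mod)

# Example with made-up sample values and a 20 MHz modulation frequency.
phi, amp = phase_and_amplitude(1200, 700, 400, 900)
print(distance_from_phase(phi, 20e6), "m")  # roughly 0.29 m for these inputs
```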
This disclosure describes TOF-based systems operable for ambient light measurements and, in some cases, distance or proximity measurements.
In one aspect, for example, the disclosure describes a method that includes detecting, in at least one active demodulation detection pixel, a first particular wavelength and generating amplitude data of the first particular wavelength; and detecting, in at least one spurious reflection detection pixel, a second particular wavelength and generating amplitude data of the second particular wavelength. The method further includes determining, in a computational device that stores spectrum data corresponding respectively to a plurality of different ambient light source types, an ambient lighting condition based on the amplitude data of the first particular wavelength, the amplitude data of the second particular wavelength and the spectrum data of a particular one of the ambient light source types associated with the amplitude data of the first particular wavelength and the amplitude data of the second particular wavelength.
In another aspect, the disclosure describes a TOF-based optoelectronic system including at least one active demodulation detection pixel operable to detect a first particular wavelength and to generate amplitude data of the first particular wavelength; and at least one spurious reflection detection pixel operable to detect a second particular wavelength and to generate amplitude data of the second particular wavelength. Outputs of the at least one active demodulation detection pixel and the at least one spurious reflection detection pixel are communicatively coupled to a computational device. The computational device includes a computer storage medium storing spectrum data that corresponds, respectively, to a plurality of different ambient light source types. The amplitude data of the first particular wavelength and the amplitude data of the second particular wavelength are associated with spectrum data of a particular one of the ambient light source types. The computational device is operable to determine an ambient lighting condition based on the amplitude data of the first particular wavelength, the amplitude data of the second particular wavelength and the spectrum data of the particular one of the ambient light source types.
Some implementations include one or more of the following features. For example, the computational device can be operable to modify a component of the time-of-flight-based optoelectronic system based on the determined ambient lighting condition. The component can be, for example, a component of the computational device such as a display screen. In such implementations, the computational device can be operable to modify at least one of brightness or color temperature of the display screen based on the determined ambient lighting condition.
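As a rough sketch of how a determined ambient lighting condition might drive such a modification, the function below maps an estimated illuminance and color temperature to display settings; the thresholds, value ranges, and function name are hypothetical and are not specified in this disclosure.

```python
def display_settings_for_ambient(lux, color_temp_k):
    """Map an estimated ambient illuminance (lux) and color temperature (K)
    to a display brightness fraction and white-point target.
    All thresholds here are illustrative placeholders."""
    if lux < 10:        # dark surroundings
        brightness = 0.15
    elif lux < 1000:    # typical indoor lighting
        brightness = 0.5
    else:               # bright daylight
        brightness = 1.0
    # Nudge the display white point toward the ambient color temperature,
    # clamped to a plausible panel range.
    white_point_k = max(2700, min(6500, color_temp_k))
    return brightness, white_point_k

print(display_settings_for_ambient(lux=350, color_temp_k=3000))  # (0.5, 3000)
```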
In some cases, the computational device is operable to identify the particular one of the ambient light source types based on a ratio of the amplitude data of the first and second particular wavelengths and based on the spectrum data of the particular one of the ambient light source types.
In some implementations, the at least one active demodulation detection pixel and the at least one spurious reflection detection pixel are operable to generate at least one of distance data or proximity data. In some cases, the time-of-flight-based optoelectronic system includes a first spectral filter disposed such that the first particular wavelength of the electromagnetic spectrum is incident on the at least one active demodulation detection pixel, and a second spectral filter disposed such that the second particular wavelength of the electromagnetic spectrum is incident on the at least one spurious reflection detection pixel. For example, the first particular wavelength can be 940 nm and the second particular wavelength can be 660 nm.
Other aspects, features and various advantages will be readily apparent from the following detailed description, the accompanying drawings, and the claims.
A spacer 114 is attached to the first side of the PCB 110 and separates the PCB 110 from an optics member 116. The spacer 114 can be composed of a material (e.g., epoxy resin) and have a thickness such that it is substantially non-transparent to wavelengths of light detectable by the TOF sensor 108. An interior wall 115 of the spacer 114 provides optical isolation between the module's two chambers (i.e., the light emission chamber (channel) 102 and the light detection chamber (channel) 104).
The optics member 116 includes a respective passive optical element (e.g., a lens) 120A, 120B for each channel 102, 104. Light from the emitter 106 is directed out of the module 100 and, if reflected by an object back toward the module's detection channel 104, can be sensed by the TOF sensor 108.
The TOF sensor 108 includes an array of spatially distributed light sensitive elements (e.g., pixels), as well as logic and other electronics to read and process the pixel signals. The pixels can be implemented, for example, in a single integrated semiconductor chip (e.g., a CCD or CMOS sensor). The emitter 106 and the TOF sensor 108 can be connected electrically to the PCB 110, for example, by conductive pads or wire bonds. The PCB 110, in turn, can be connected electrically to other components within a host device (e.g., a smart phone). The TOF sensor 108 is operable to resolve distance based on the known speed of light by measuring the time-of-flight of a light signal between the sensor and the subject for each point of an object. The circuitry in the TOF sensor 108 can use signals from the pixels to calculate, for example, the time the light has taken to travel from the emitter to an object of interest and back to the focal plane array.
The TOF sensor 108 can be implemented, for example, as an integrated sensor chip. As shown in
The sensor's processing circuitry can be implemented, for example, as one or more integrated circuits in one or more semiconductor chips with appropriate digital logic and/or other hardware components (e.g., read-out registers; amplifiers; analog-to-digital converters; clock drivers; timing logic; signal processing circuitry; and/or a microprocessor). The processing circuitry may reside in the same semiconductor chip as the sensor 108 or in one or more other semiconductor chips.
In the example of
In some of the examples described here, it is assumed that spurious reflections may be caused by a smudge on the cover glass of the host device. However, the modules and techniques described below also can be applicable to spurious reflections resulting from other direct reflections such as from the cover glass, from a filter, or from other optical/non-optical components in the optoelectronic module or host device.
In some cases, the spurious-reflection detection pixel(s) 126 is positioned relative to the demodulation detection pixels 124, for example at a sufficient lateral distance (d) on the sensor 108, such that, in the absence of a smudge on the cover 132 of the host device, the spurious-reflection detection pixel 126 senses, at most, a signal representing only a relatively low optical intensity of light reflected by an object in a scene outside the module 100. In contrast, when a smudge 130 is present on the surface of the cover 132, the smudge can redirect some of the light reflected by the external object toward the spurious-reflection detection pixel(s) 126 such that they sense a significantly higher optical intensity.
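One simple way the foregoing comparison could be realized is sketched below: the amplitude reported by the spurious-reflection pixel is compared against a calibrated no-smudge baseline. The threshold factor, baseline value, and function name are assumptions for illustration, not values from this disclosure.

```python
def smudge_present(spurious_amplitude, baseline_amplitude, factor=5.0):
    """Flag a likely smudge on the cover glass when the spurious-reflection
    pixel reports an amplitude well above its calibrated no-smudge baseline.
    The factor of 5 is an illustrative threshold only."""
    return spurious_amplitude > factor * baseline_amplitude

print(smudge_present(spurious_amplitude=820.0, baseline_amplitude=40.0))  # True
```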
The optical intensity sensed by the spurious-reflection detection pixel 126 can be used by the sensor's processing circuitry to determine whether a smudge is present on the cover glass 132 and to determine how much light (i.e., amplitude and phase) collected by the active pixels 124 results from the smudge rather than the object of interest. For example, as illustrated by
In some implementations, the module includes pixels that serve as combined reference and spurious-reflection detection pixels. An example is illustrated in
In some implementations, instead of, or in addition to, dedicated spurious-reflection detection (i.e., smudge) pixels, signals obtained from the demodulation detection pixels 124 can be used to determine the wave component (i.e., amplitude, phase) that is caused by reflection from a smudge 130 on the surface of the cover glass 132. To do so, the wave component caused by the smudge reflection can be estimated, for example, by repeating measurements at two different modulation frequencies. Assuming the distance between the smudge 130 and the emitter 106 is known to the module's processing circuitry (e.g., based on a previously stored value in memory and/or calibration of the module), the additional wave component resulting from the presence of the smudge 130 can be determined by the processing circuitry. Any such additional wave component would be common to signals detected by the demodulation detection pixels 124 at both modulation frequencies. The additional wave component caused by the smudge 130 can be eliminated (i.e., subtracted out) through known vector manipulation techniques, and the wave components resulting from light reflected by the object of interest outside the module can be calculated. The resulting phase shift then can be used to calculate the distance to the object 135.
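The vector manipulation mentioned above can be pictured as phasor arithmetic: each measurement is an amplitude/phase pair treated as a complex number, and the component attributable to the smudge is subtracted. The numeric values below, and the assumption that the smudge phasor has already been estimated, are illustrative only.

```python
import cmath
import math

C = 299_792_458.0  # speed of light, m/s

def remove_smudge_component(total_amp, total_phase, smudge_amp, smudge_phase):
    """Subtract the smudge-reflection phasor from the measured phasor and
    return the amplitude and phase of the light from the object of interest."""
    obj = cmath.rect(total_amp, total_phase) - cmath.rect(smudge_amp, smudge_phase)
    return abs(obj), cmath.phase(obj) % (2.0 * math.pi)

def distance_from_phase(phase, f_mod):
    return C * phase / (4.0 * math.pi * f_mod)

# Illustrative numbers: a strong near-field smudge return superimposed on a
# weaker object return, measured at an assumed 20 MHz modulation frequency.
amp, phase = remove_smudge_component(total_amp=900.0, total_phase=0.35,
                                     smudge_amp=600.0, smudge_phase=0.02)
print(distance_from_phase(phase, 20e6), "m")  # roughly 1 m for these inputs
```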
TOF-based systems such as those described above can be used to determine ambient light conditions. In some implementations, the TOF-based system 100 is operable to detect a particular wavelength (e.g., 940 nm) of the electromagnetic spectrum via the active demodulation detection pixels 124 and the one or more dedicated spurious reflection detection pixels 126. In other implementations, the active demodulation detection pixels 124 are operable to detect a first particular wavelength (e.g., 940 nm) and the one or more dedicated spurious reflection detection pixels 126 are operable to detect a second, different particular wavelength of the electromagnetic spectrum (e.g., 660 nm). For example, a first spectral filter can be disposed such that the first wavelength is incident on the active demodulation detection pixel(s) 124, and a second spectral filter can be disposed such that the second wavelength is incident on the spurious reflection detection pixel(s) 126. The respective amplitudes of the particular wavelengths then can be determined, for example, as described above.
As shown in
In some implementations, such as implementations in which the active demodulation detection pixels 124 and the one or more dedicated spurious reflection detection pixels 126 respectively detect different particular wavelengths of light, the amplitudes of both wavelengths (e.g., their ratio) can be used by the processing circuitry 404 to deduce the particular ambient light source type (see FIG. 6, block 600) even without user input specifying the ambient light source type. The processing circuitry 404 then compares the amplitudes (e.g., the ratio) of the particular wavelengths of light to the stored spectrum data (block 602) and deduces the ambient lighting conditions (e.g., intensity and color temperature) based on the stored spectrum data corresponding to the particular ambient light source type and on the amplitudes of the particular wavelengths (block 604).
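A minimal sketch of the flow in blocks 600-604 follows, assuming for simplicity that the stored spectrum data is reduced to an expected 940 nm / 660 nm amplitude ratio per light source type; the table entries, scale factor, and nominal color temperatures are placeholders, not data from this disclosure.

```python
# Stored spectrum data reduced, for this sketch, to the expected ratio of the
# 940 nm amplitude to the 660 nm amplitude for each ambient light source type.
# These numbers are placeholders, not measured spectra.
EXPECTED_RATIO = {
    "sunlight": 0.9,
    "incandescent": 1.6,
    "fluorescent": 0.2,
    "led": 0.1,
}

NOMINAL_CCT_K = {"sunlight": 5500, "incandescent": 2700,
                 "fluorescent": 4000, "led": 6000}

def identify_source(amp_940, amp_660):
    """Blocks 600/602: pick the stored source type whose expected ratio is
    closest to the measured 940/660 amplitude ratio."""
    ratio = amp_940 / amp_660
    return min(EXPECTED_RATIO, key=lambda s: abs(EXPECTED_RATIO[s] - ratio))

def deduce_conditions(amp_940, amp_660, lux_per_count=0.05):
    """Block 604: estimate intensity and look up a nominal color temperature
    for the identified source type. The scale factor is an assumption."""
    source = identify_source(amp_940, amp_660)
    lux_estimate = (amp_940 + amp_660) * lux_per_count
    return source, lux_estimate, NOMINAL_CCT_K[source]

print(deduce_conditions(amp_940=480.0, amp_660=300.0))  # ('incandescent', 39.0, 2700)
```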
The deduced ambient lighting conditions can be used to modify various components of the computational device (see
In some implementations, the TOF-based systems described above also are operable to determine proximity or distance to objects. For example, the TOF-based system 100 can be operable to detect light reflected from objects via active demodulation detection pixels 124, and the amplitude of the reflected light can be determined as described above. In some instances, the amplitude corresponds to the proximity of the object. For example, the amplitude may be relatively large when the object is in close proximity and may be smaller when the object is further away. The processing circuitry 404 of the computational device 400 can use the distance and/or proximity information to modify various components of the computational device in conjunction with the deduced ambient lighting conditions. For example, in some instances the distance and/or proximity data can be used to determine whether an object is in close proximity to the computational device. In such situations, amplitude data collected by the active demodulation detection pixels 124 and/or the one or more dedicated spurious reflection detection pixels 126 need not be used to deduce ambient lighting conditions since the data may not accurately reflect ambient lighting conditions; accordingly, computational resources and power may be conserved.
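The gating described in this paragraph might look like the sketch below; the proximity threshold and the idea of reusing a cached previous estimate are assumptions for illustration.

```python
def maybe_update_ambient(amplitudes, proximity_mm, last_estimate, estimate_fn,
                         near_threshold_mm=50.0):
    """Skip ambient-light estimation when an object is very close, since the
    measured amplitudes would mostly reflect the nearby object rather than the
    ambient light; reuse the previous estimate to save computation and power."""
    if proximity_mm < near_threshold_mm:
        return last_estimate
    return estimate_fn(amplitudes)

# Illustrative use: with an object 20 mm away, keep the cached estimate.
cached = ("sunlight", 1200.0, 5500)
print(maybe_update_ambient(amplitudes=(480.0, 300.0), proximity_mm=20.0,
                           last_estimate=cached,
                           estimate_fn=lambda a: ("unknown", sum(a) * 0.05, 5000)))
```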
Various aspects of the subject matter and the functional operations described in this specification can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Embodiments of the subject matter described in this specification can be implemented as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus. The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The terms “data processing apparatus” and “computer” encompass all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers. The apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
A computer program (also known as a program, software, software application, script, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a markup language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
The processes and logic flows described in this specification can be performed by one or more programmable processors executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus can also be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit).
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a smartphone, a personal digital assistant (PDA), a mobile audio player, a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto optical disks; and CD ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
A number of implementations have been described. Nevertheless, various modifications may be made without departing from the spirit of the invention. Accordingly, other implementations are within the scope of the claims.
Inventors: Buettgen, Bernhard; Rossi, Markus; Geiger, Jens; Kiy, Michael; Chidley, Oliver