Techniques for designing a sensing system for pseudo 3D mapping in robotic applications are described. According to one aspect of the present invention, an image system is designed to include at least two linear sensors, where the two linear sensors are positioned or disposed orthogonally. In one embodiment, the two linear sensors are a horizontal sensor and a vertical sensor. The horizontal sensor is used for the lidar application while the vertical sensor is provided to take videos, namely to scan the environment wherever the horizontal sensor misses. As a result, the videos can be analyzed to detect anything below or above a blind height in conjunction with the distances detected by the lidar.
1. A sensing system for pseudo 3D mapping, the sensing system comprising:
at least a first linear sensor and a second linear sensor;
the first linear sensor, disposed horizontally, provided to measure a distance towards a target, wherein the distance is measured at a predefined height from a ground; and
the second linear sensor, disposed vertically, provided to take a video of an environment, wherein the first linear sensor and the second linear sensor are in a predefined rigid relationship, image data from the second linear sensor is incorporated with the distance obtained from the first linear sensor, the sensing system is mounted on a robot moving in the environment, an image of the environment is generated from the second linear sensor synchronized with respective distances obtained from the first linear sensor while the robot is moving in the environment.
2. The sensing system as recited in
3. The sensing system as recited in
4. The sensing system as recited in
5. The sensing system as recited in
6. The sensing system as recited in
7. The sensing system as recited in
a linear array of pixels, each of the pixels including one photosensor and producing a final signal within a predefined frame, wherein the final signal captures a reflected light without background light interference;
a readout circuit including at least a first storage device and a second storage device; and
a clock circuit, coupled to the sensor and the readout circuit, causing the readout circuit to store a first charge and a second charge on the first storage device and the second storage device, respectively, wherein the first charge or the second charge is proportional to an intensity of an incoming light impinged upon the photosensor, the first charge and the second charge are created successively within the predefined frame, and the final signal is a subtraction between the first charge and the second charge.
8. The sensing system as recited in
9. The sensing system as recited in
10. The sensing system as recited in
11. The sensing system as recited in
12. The sensing system as recited in
13. The sensing system as recited in
14. The sensing system as recited in
15. The sensing system as recited in
16. A method for pseudo 3D mapping, the method comprising:
measuring a distance towards a target by a first linear sensor as part of a lidar system, wherein the distance is measured at a predefined height from a ground;
scanning and generating a video of an environment by a second linear sensor, wherein the first linear sensor and the second linear sensor are in a predefined rigid relationship, image data from the second linear sensor is incorporated with the distance obtained from the first linear sensor, both of the first and second linear sensors are mounted on a robot moving in the environment, an image of the environment is generated from the second linear sensor synchronized with respective distances obtained from the first linear sensor while the robot is moving in the environment.
17. The method for pseudo 3D mapping as recited in
18. The method for pseudo 3D mapping as recited in
This application is a continuation-in-part of co-pending U.S. patent application Ser. No. 15/981,926, entitled “Sensor, apparatus for lidar application and method of background cancellation”, filed on May 17, 2018.
The present invention is related to the area of robotic vision systems. More particularly, the present invention is related to a sensing system including at least two perpendicular image sensors provided to sense the surroundings of a robot to generate a pseudo 3D model, and a method of using the same.
Mobile robotics is one of the fastest expanding fields of scientific research today. With proper additional mechanisms, mobile robots can substitute for humans in many applications, such as surveillance, patrolling, industrial automation, and construction. A robot is autonomous when it has the ability to determine, with the help of a perception system, the actions to be taken to perform a task. Lidar (also written LIDAR or LiDAR) is a common solution in many applications such as robotic vacuuming. It is a surveying method that measures distance to a target by illuminating the target with dot pulsed laser light and measuring the reflected pulses with a sensor. Differences in the locations (or responses) of the dot laser returns on the sensor are then used to make digital 3-D representations of the target. The name lidar, now used as an acronym for light detection and ranging (sometimes light imaging, detection, and ranging), was originally a portmanteau of light and radar. Lidar is sometimes also called laser scanning or laser depth sensor scanning, with terrestrial, airborne, and mobile applications.
Lidar uses ultraviolet, visible, or near infrared light to image objects. It can target a wide range of materials.
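The paragraph above describes ranging from the location of the laser-dot return on the sensor, which is a triangulation-style measurement. As a rough sketch (not taken from the patent), the dot's offset on the linear sensor maps to target distance through the emitter-sensor geometry; the focal length, baseline, and offset values below are hypothetical.

```python
# Illustrative triangulation sketch: a laser dot reflected by a nearer
# target lands farther from the optical axis on the linear sensor.
def triangulate_distance(dot_offset_m: float,
                         focal_length_m: float = 0.008,
                         baseline_m: float = 0.05) -> float:
    """Distance implied by the laser-dot offset on the sensor (thin-lens model)."""
    return focal_length_m * baseline_m / dot_offset_m

# A dot landing 0.4 mm off-axis implies a target 1 m away with these optics.
print(triangulate_distance(0.0004))  # 1.0
```

With this geometry, doubling the dot offset halves the estimated distance, which is why a linear array with many small pixels yields fine range resolution at close distances.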
This section is for the purpose of summarizing some aspects of the present invention and to briefly introduce some preferred embodiments. Simplifications or omissions in this section as well as in the abstract or the title of this description may be made to avoid obscuring the purpose of this section, the abstract and the title. Such simplifications or omissions are not intended to limit the scope of the present invention.
In general, the present invention pertains to designs of image sensors and their practical uses. According to one aspect of the present invention, an image system is designed to include at least two linear sensors, where the two linear sensors are positioned or disposed orthogonally. In one embodiment, the two linear sensors are a horizontal sensor and a vertical sensor. The horizontal sensor is used for the lidar application while the vertical sensor is provided to take videos, namely to scan the environment wherever the horizontal sensor misses. As a result, the videos can be analyzed to detect anything below or above a blind height in conjunction with the distances detected by the lidar.
The linear sensors take advantage of the architecture of a CMOS sensor with correlated double sampling, or CDS, to avoid halving the sensing speed. It is commonly known that a photosensor is read twice (i.e., first and second readouts) in CDS to remove the inherent noises from the photosensor itself. Instead of subtracting a pixel's dark or reference output level from an actual light-induced signal, a background image is captured before the second readout of the sensor and subtracted from an actual image, where the actual image is assumed to include a target. As a result, the readout speed of the image sensor is maintained while the background light interference is removed.
According to another aspect of the present invention, a 2D sensor is operated to work as multiple line sensors. Color filters may be added to allow the vertical sensors to generate color images. Depending on the implementation, the color filters may be based on a set of red, green and blue (RGB) filters or a set of RGB filters with one or more other filters (e.g., infrared or UV).
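The idea of operating a 2D sensor as multiple line sensors can be sketched as follows. The row indices and filter layout are illustrative assumptions: three adjacent rows sitting behind red, green, and blue filters act as three color line sensors whose readouts combine into one RGB scan line.

```python
import numpy as np

# One hypothetical 2D readout: 8 rows x 500 columns of raw pixel values.
frame = np.random.randint(0, 256, size=(8, 500), dtype=np.uint16)

# Assumed positions of the filtered rows (not from the patent).
r_row, g_row, b_row = 2, 3, 4

# Each filtered row is treated as an independent line sensor; stacking
# them per-column yields a single color scan line.
rgb_line = np.stack([frame[r_row], frame[g_row], frame[b_row]], axis=1)
print(rgb_line.shape)  # (500, 3)
```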
According to still another aspect of the present invention, there are two independent storage devices (e.g., capacitors) in the photosensor, each provided to store a charge from an exposure. According to yet another aspect of the present invention, a clock signal circuit is provided to control the first and second readouts of a photosensor. Clock signals are designed to ensure two independent exposures take place successively within one image frame. The two readouts stored in the capacitors from the two independent successive exposures are available, from which a final signal is obtained.
The present invention may be implemented in various ways including a method, an apparatus or a system. According to one embodiment, the present invention is a sensing system comprising: at least a first linear sensor and a second linear sensor, the first linear sensor, disposed horizontally, provided to measure a distance towards a target as part of a lidar system, wherein the distance is measured at a predefined height from a ground; and the second linear sensor, disposed vertically, provided to take a video of an environment. The first linear sensor and the second linear sensor are in a predefined rigid relationship, and image data from the second linear sensor is incorporated with the distance obtained from the first linear sensor. The sensing system is mounted on a robot moving in the environment, and an image of the environment is generated from the second linear sensor, synchronized with respective distances obtained from the first linear sensor while the robot is moving in the environment.
According to another embodiment, the present invention is a method for pseudo 3D mapping, the method comprising: measuring a distance towards a target by a first linear sensor as part of a lidar system, wherein the distance is measured at a predefined height from a ground; and scanning and generating a video of an environment by a second linear sensor. The first linear sensor and the second linear sensor are in a predefined rigid relationship. Image data from the second linear sensor is incorporated with the distance obtained from the first linear sensor, and both of the first and second linear sensors are mounted on a robot moving in the environment. An image of the environment is generated from the second linear sensor, synchronized with respective distances obtained from the first linear sensor while the robot is moving in the environment.
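The synchronization implied by the method above can be sketched as a simple loop: every frame, the vertical scan line is stored alongside the lidar distance captured in that same frame, so the assembled image carries per-column range data. The class and the placeholder scan values below are illustrative, not a real sensor driver API.

```python
from dataclasses import dataclass, field

@dataclass
class Pseudo3DMap:
    columns: list = field(default_factory=list)    # vertical scan lines
    distances: list = field(default_factory=list)  # matched lidar ranges

    def add_frame(self, scan_line, distance_m):
        """Record one synchronized (scan line, distance) pair."""
        self.columns.append(scan_line)
        self.distances.append(distance_m)

m = Pseudo3DMap()
for step in range(3):                    # robot advances over three frames
    m.add_frame([0] * 500, 1.5 + 0.1 * step)
assert len(m.columns) == len(m.distances) == 3
```

Because the two sensors are in a rigid, known relationship, each stored column can later be projected into 3D using its matched distance, as the claims describe.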
Different objects, features, and advantages of the present invention will become apparent upon examining the following detailed description of an embodiment thereof, taken in conjunction with the attached drawings.
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
The detailed description of the present invention is presented largely in terms of procedures, steps, logic blocks, processing, or other symbolic representations that directly or indirectly resemble the operations of devices or systems contemplated in the present invention. These descriptions and representations are typically used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
Reference herein to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the invention. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
Embodiments of the invention are discussed below with reference to
An active-pixel sensor (APS) is an image sensor comprising an integrated circuit containing an array of pixel sensors, each pixel containing a photodetector and an active amplifier. There are many types of active-pixel sensors, including the CMOS APS. Such an image sensor is produced by a CMOS process (and is hence also known as a CMOS sensor), and has emerged as an alternative to charge-coupled device (CCD) image sensors. Depending on the application, the image sensor may be implemented as a linear sensor or an area sensor. To facilitate the description of the present invention, the description herein is largely based on a linear array of photosensors unless explicitly stated.
A vertical sensor 216, on either side of the horizontal sensor 212, is disposed orthogonally to and maintains a predefined relationship with the horizontal sensor 212. It captures a line of video of the target in front of it.
Referring now to
It is assumed that there are 500 photosensors or elements on the sensor 302. The vertical sensor 302 captures a line of the scene while the horizontal sensor 304 detects a distance at a predefined height (e.g., a blind height of
The amplifier 412 is provided to amplify the signal produced by the photodiode 416. As an example shown in
Correlated double sampling, or CDS, is a technique employed to improve the signal-to-noise ratio (S/N) of an image sensor by reading out the pixel 410 twice. The first readout happens right after the exposure of the sensor to a scene. The second readout happens without the sensor being exposed to the scene, soon after the first readout has successfully occurred. Accordingly, the first readout is herein referred to as the actual light-induced signal while the second readout is referred to as the reference signal. The reference signal largely comes from the internal dark or reference output level in the pixel. By subtracting the reference signal from the actual light-induced signal, static fixed-pattern noise (FPN) and several types of temporal noise are effectively removed from the output of the sensor. In operation, the first readout of the signal from the photosensor 416 is stored on a capacitor 418 and the second readout of the signal from the photosensor 416 is stored on a capacitor 420. The final readout of the signal is the difference between the signals on the capacitors 418 and 420.
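The CDS subtraction described above can be illustrated numerically (this is a model, not sensor code): the reference readout carries the same per-pixel fixed-pattern offset as the light-induced readout, so subtracting the two cancels the offset exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
fixed_pattern = rng.normal(0, 5, size=500)   # per-pixel FPN offset (arbitrary)
signal = np.full(500, 100.0)                 # true light-induced level (arbitrary)

first_readout = signal + fixed_pattern       # light-induced signal + noise
second_readout = fixed_pattern               # reference (dark) level only
cds_output = first_readout - second_readout  # the per-pixel offset cancels

print(np.allclose(cds_output, signal))  # True
```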
Referring now to
The signals set 522 shows the clock signal 528 modified or redesigned by including a pulse 529 (e.g., the width of the pulse may be adjusted to substantially that of the exposure pulse 544) to disable the second readout from the photodiode. Instead, the pulse 529 causes the sensor to be immediately exposed to the scene with the light source turned off. The resultant readout from the sensor is shown as 540 and includes the signal 542 of the reflected light dot from the light (e.g., visible or invisible laser or infrared) emitted by a light source disposed next to the sensor in a predefined configuration.
Referring now to
In operation, there are essentially two successive exposures with the photodiode 602. In one embodiment, the first exposure is a scene with a light source turned off. The charge on the capacitor 606 pertains to a background. If there are ambient lights in the background, the charge on the capacitor 606 would capture the ambient lights. The second exposure is a scene with a light source turned on. In other words, an object is being illuminated by a projected light from a light source with a known configuration with respect to the image sensor (e.g., the photodiode 602). The charge on the capacitor 608 pertains to the background as well as the reflection of the light on the object. An adder 610 is provided to perform the subtraction between the two charges on the two different capacitors 606 and 608, namely subtracting the background from the second charge. As a result, the final signal presents a clean reflection of the projected light.
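The two-exposure cancellation described above can be sketched with plain numbers (the charge values are arbitrary, and the "capacitors" are modeled as floats named after the reference numerals in the text):

```python
background = 30.0                  # ambient-light charge, light source off
reflection = 70.0                  # charge due to the projected light alone

cap_606 = background               # first exposure: light source off
cap_608 = background + reflection  # second exposure: light source on

final_signal = cap_608 - cap_606   # the adder 610 subtracts the background
print(final_signal)  # 70.0
```

However strong the ambient light, it appears identically in both charges, so the difference isolates the projected light's reflection.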
According to one embodiment, the present invention may be realized by a clock signal circuit to control an existing CMOS image sensor, where the clock signal circuit generates one or more signals, at least one of which is used to modify the clock signal 528. The modified signal is shown correspondingly in the signal set 522. One of the advantages, benefits and objectives of this implementation is to take advantage of existing CMOS sensors to realize one embodiment of the present invention.
According to another embodiment, a circuit for generating the clock signals 714 may be simply modified to generate the corresponding clock signal in the signals set 522 to effectively control the operation of the sensor 706, resulting in two successive exposures to cancel the background light interference, wherein charges from the two successive exposures are retained within a pixel before a final readout is obtained from the pixel.
The present invention has been described in sufficient detail with a certain degree of particularity. It is understood by those skilled in the art that the present disclosure of embodiments has been made by way of examples only and that numerous changes in the arrangement and combination of parts may be resorted to without departing from the spirit and scope of the invention as claimed. While the embodiments discussed herein may appear to include some limitations as to the presentation of the information units, in terms of the format and arrangement, the invention has applicability well beyond such embodiments, as can be appreciated by those skilled in the art. Accordingly, the scope of the present invention is defined by the appended claims rather than the foregoing description of embodiments.
| Patent | Priority | Assignee | Title |
| 9392259, | Dec 23 2010 | Fastree3D SA | 2D/3D real-time imager and corresponding imaging methods |
| 20150002629, | |||
| 20150339826, | |||
| 20190323845, | |||
| 20210089040, | |||
| 20220299650, |
| Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
| Nov 17 2020 | WANG, WENG LYANG | CMOS SENSOR, INC | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 054438 | /0694 | |
| Nov 17 2020 | WEI, HUI | CMOS SENSOR, INC | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 054438 | /0694 | |
| Nov 18 2020 | CMOS Sensor, Inc. | (assignment on the face of the patent) | / |