A three-dimensional imaging device is presented which uses a single pulse from a pulsed light source to detect objects which are obscured by camouflage, fog or smoke but otherwise enveloped by a light-transmitting medium. The device operates simultaneously in two modes. In the first mode, light reflected from the nearest object is processed by an array of pixels to form a three-dimensional image based upon the light-pulse transit time recorded in each pixel. In the second mode, each pixel uses a high-speed analog memory to sequentially store reflected signals at a repeated time interval. The first reflection acts as a time base that controls when the analog memory begins or ends the storage sequence. The first return could be from a camouflage net; the amplitudes of the return signals after the first return would then be from objects behind the net. Computer processing of these amplitudes reveals the three-dimensional nature of the obscured objects.
The device consists of the pulsed light source, optics for collecting the reflected light, a sensor for detecting the light and converting it to electrical data, drive and output electronics for timing and signal conditioning of data generated by the sensor, and a computer for processing the sensor data and converting it to a three-dimensional image. The sensor collects and processes the light data in a unique manner, first converting it to electricity by one of a number of alternative detector technologies and then using integrated circuit chips which consist of a two-dimensional array of electronic pixels, also called unit cells. The two-dimensional array defines two dimensions of the image. Stored within each unit cell is data associated with the third dimension: ranges of targets and amplitudes of target reflections. This data is read out of the integrated circuit chip in the time interval between laser pulses to a processing computer. The processing computer corrects the data and, by means of computer algorithms specific to the device, converts the data to a three-dimensional image of one or more targets. This image may be viewed or processed electronically to isolate targets.
91. A method for imaging one or more three dimensional objects obscured by reflective or absorptive material but otherwise immersed in a light conducting medium comprising the steps of:
generating a series of pulses of light; transmitting said light into said medium to the source of obscuration; collecting light from said source of obscuration during the time of transmission and reflection of light from said source of obscuration; detecting said collected light; providing timing control from said detected light; providing electrical signals from a plurality of positions on the objects beyond the source of obscuration with a single light pulse; storing said electrical signals on a plurality of unit cells corresponding to said plurality of positions on said objects; providing signals from said storage means; and converting the signals stored on said storage means to a three-dimensional image of the objects.
46. A sensor means for detecting collected light, said sensor means comprising
means for converting collected light into electrical charge, chip means comprising multiplexing and chip output electronics, a plurality of collection or detection means for collecting or detecting the electrical charge from said conversion means, a plurality of unit cell processing electronics including memory units for storing data related to a first return transit time for a reflected laser pulse from a target pixel, and additional memory units for storing data related to the amplitude of laser pulse reflections from one or a plurality of targets, including control circuitry by which the sampling time intervals of said memory units for storing data related to the amplitude of laser pulse reflections are independently controlled from unit cell to unit cell, and including output amplifier electronics adapted to provide signals to said multiplexing and chip output electronics; drive electronics for providing voltages and for providing timing for said unit cell processing electronics, output amplifier electronics and said multiplexing and chip output electronics; and output electronics for conditioning the signals from said memory units for data processing; a computer for processing data from said output electronics.
1. A device for imaging one or more three dimensional objects immersed in a light conducting medium comprising:
a pulsed light source; means for transmitting light from said pulsed light source into said medium; optics for collecting light from said medium during the time for light to transit from said pulsed light source, reflect from said objects and be collected by said optics; a sensor means for detecting said collected light, said sensor means comprising means for converting said collected light into electrical charge, chip means comprising multiplexing and chip output electronics, a plurality of collection or detection means for collecting or detecting the electrical charge from said conversion means, a plurality of unit cell processing electronics including memory units for storing data related to a first return transit time for a reflected laser pulse from a target pixel, and additional memory units for storing data related to the amplitude of laser pulse reflections from one or a plurality of targets, including control circuitry by which the sampling time intervals of said memory units for storing data related to the amplitude of laser pulse reflections are independently controlled from unit cell to unit cell, and including output amplifier electronics adapted to provide signals to said multiplexing and chip output electronics; drive electronics for providing voltages and for providing timing for said unit cell processing electronics, output amplifier electronics and said multiplexing and chip output electronics; and output electronics for conditioning the signals from said memory units for data processing; a computer for processing data from said output electronics.
[Dependent claims 2–45, 47–90, and 92–106 are truncated in the source.]
This invention relates to a laser radar vision apparatus capable of producing three-dimensional images of distant targets located behind reflective or absorbing but penetrable barriers such as camouflage and obscuring smoke. In particular, this invention relates to a multiple pixel, electronic apparatus for capturing three-dimensional images of distant targets, within obscurants, at high-spatial and high-range resolution in the atmosphere or in space with a single laser pulse, using a laser-reflection generated trigger.
This application is a continuation-in-part of U.S. patent application Ser. No. 08/665,738, 3D Imaging Laser Radar, filed Jun. 19, 1995, now U.S. Pat. No. 6,133,989, which is a continuation-in-part of Ser. No. 08/015,623, now U.S. Pat. No. 5,446,529. Laser radars (ladars) determine range in the atmosphere by measuring the transit time of a laser pulse from the transmitter/receiver to a partially or fully reflective target, multiplying by the velocity of light in the atmospheric medium and dividing by two. If there is more than one return pulse, only the first return pulse is used in the range processing. Range resolution in such devices is related to the accuracy of the transit time measurement. In the atmosphere, ranges are typically measured in kilometers, where range resolution can be smaller than 30 cm. A 3-D target image can be obtained with a conventional laser radar by rastering the laser beam across the target and measuring the transit time, pulse by pulse, where each pulse corresponds to a point on the target. The distance between points on the target determines the spatial resolution of the rastered image and defines the picture element (pixel) size; the number of pixels at the target determines the pixel-array size; the range resolution determines resolution in the third target dimension. Rastering is a slow process, particularly for large pixel-array sizes, and it requires cumbersome mechanical scanners and complex pixel-registration computer processing. In addition, if the first laser-pulse return is from a partially reflective obscurant which is hiding the target, then the 3-D image does not reveal the nature of the real target.
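As an illustrative aside (not part of the claimed apparatus), the range computation just described reduces to a one-line formula; the constant and example values below are assumptions:

```python
# Illustrative sketch: the ladar range equation described above,
# r = c * t / 2, where t is the measured round-trip transit time.
C_LIGHT = 2.998e8  # approximate velocity of light in air, m/s

def range_from_transit_time(t_seconds: float) -> float:
    """Convert a round-trip transit time to a one-way range in meters."""
    return C_LIGHT * t_seconds / 2.0

# A 1 ns timing uncertainty corresponds to ~15 cm of range uncertainty,
# consistent with the sub-30-cm resolution quoted above.
print(range_from_transit_time(10e-6))  # 10 us round trip -> ~1499 m
```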
U.S. patent application Ser. No. 08/665,738, filed Jun. 19, 1995, by the present inventors disclosed a lightweight, small-size, multiplexing laser radar receiver, the LR-FPA, that could image an entire target, in the atmosphere or in space, at high-spatial and high-range resolution with a single laser pulse. Thus the necessity of laser rastering, or of using a multitude of laser pulses, to obtain a three-dimensional image is avoided. Reflected laser pulses from different portions of an object stop independent clocks located in a two-dimensional array of pixels. The times at which the clocks are stopped are related to the third dimension of the object by the velocity of light and are stored in the pixels along with peak signal data. The time data and peak signal data are read out from the array between laser pulses and used to construct the three-dimensional image. Processing the peak signal amplitude with the time data increases the range resolution accuracy. More than one reflected pulse for each pixel is accommodated by separately storing the return time and peak signal of each reflection.
U.S. Pat. No. 5,446,529, issued Aug. 29, 1995 to the present inventors, discloses a lightweight, multiplexing laser radar receiver (3DI-UDAR) that can generate a three-dimensional image of an entire object, in a light conducting medium such as water or the atmosphere, with a single laser pulse. The imaging is accomplished by integrating and storing the reflected signals from a multitude of range resolution intervals (range bins), independently for each of a two-dimensional array of pixels; each range bin across the two-dimensional array corresponds to a range slice in three dimensions. After the range bin data are read out between laser pulses, the return time of the laser pulse from the object is determined for each range bin in the two-dimensional array by means of the integration clock and the start of integration. The three-dimensional image is constructed from knowledge of the return time of each two-dimensional slice. Because there is return pulse amplitude information as a function of time, rather than just the peak of the return pulse, more information can be derived concerning the character of the target. The first range bin begins storing information in response to a signal from the invention's drive electronics rather than from an external signal such as the first reflection from the light conducting medium (the surface of the water, for example).
There is only a finite storage capacity for each of the pixels (typically 30 to 200 storage bins) in the 3DI-UDAR. For high spatial resolution in a medium that does not attenuate the light appreciably, the effective depth from which the information comes is only a small proportion of the entire range. For example, at 30 cm range resolution, 200 storage bins may only correspond to a depth of 60 m, whereas the absolute range of the ladar may be many tens of kilometers. Turning on the range bin integration at the optimum range position (the target position) could involve a trial-and-error process requiring more than one laser pulse, or another system which first finds the time delay to the target and then transfers that time delay to the drive electronics.
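A minimal arithmetic sketch of this storage limitation, using the example numbers above:

```python
# Depth window covered by the range bins versus the total operating range.
num_bins = 200           # storage bins per pixel (typical upper value)
bin_resolution_m = 0.30  # 30 cm range resolution

depth_window_m = num_bins * bin_resolution_m
print(depth_window_m)    # 60.0 m, a small slice of a tens-of-km range
```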
In the present invention reflected laser pulses from one or more targets, or from an obscured target, are integrated and stored in a sequence of range bins, independently for each of a two-dimensional array of pixels. The range bin integrations are turned on automatically, including a possible programmed delay, as the first reflection arrives at the receiver of the invention. Alternatively the integrations occur continuously, with the storage bins being filled by new data until the first reflection arrives. Storage then proceeds until a predetermined but adjustable number of storage bins are filled, allowing the option of storing data obtained prior to the first reflection and/or after it. By processing this data, preferably between laser pulses, one or more three-dimensional images of single or multiple targets, or of targets within obscurants, can be generated.
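The continuously-overwritten alternative behaves like a circular buffer. The following toy model is a sketch under invented names, not the patent's circuitry:

```python
from collections import deque

class RangeBinBuffer:
    """Toy model (names invented) of one pixel's range bins: samples
    are continuously overwritten in a circular buffer until the first
    reflection triggers; capture then continues for a predetermined
    number of bins, so data from before and/or after the first
    reflection can be retained."""

    def __init__(self, num_bins: int, bins_after_trigger: int):
        self.bins = deque(maxlen=num_bins)   # oldest samples overwritten
        self.bins_after_trigger = bins_after_trigger
        self.remaining = None                # None until first reflection

    def sample(self, amplitude: float, first_return: bool) -> bool:
        """Store one amplitude sample; True means capture is complete."""
        if self.remaining == 0:
            return True                      # buffer frozen for readout
        self.bins.append(amplitude)
        if first_return and self.remaining is None:
            self.remaining = self.bins_after_trigger
        elif self.remaining is not None:
            self.remaining -= 1
        return self.remaining == 0
```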
A preferred embodiment of the sensor of the invention is a hybrid of two chips: an array of detectors (or collectors) directly connected to an array of processing-electronics unit cells. Each detector/collector, on one chip, is connected to its own processing-electronics unit cell, on the other chip, and defines a pixel in the image. Each processing-electronics unit cell on the array contains an identical and unique integrated circuit which can store the first reflected-pulse transit time, the first reflected peak amplitude and/or a sequence of range bins which contain amplitude information about the first reflected pulse and subsequent reflected pulses, or amplitude information only about subsequent reflected pulses. Transit-time and pulse amplitude information for all pixels is read out, preferably between laser pulses. In different embodiments of the invention, laser photons interact directly with the detector array to generate a signal, or laser photons are converted to electrons that then generate the signal.
It is the object of the present invention to provide a device for three dimensional imaging of obscured or unobscured objects using a single laser pulse, in transparent or semi-transparent media by a sequence of measurements on the returned pulse amplitude and to overcome the problems of prior systems associated with the need for first determining the range of the target. The device comprises a pulsed light source, means for projecting the light towards the object, optics for collecting the reflected light, improved sensors for detecting the reflected light, drive and output electronics for timing and signal conditioning of data from the sensors and a computer and software for converting the sensor data to a three dimensional image.
It is further the object of the present invention to provide improved sensors which detect and store laser-pulse, target-reflected, transit-time and/or target-reflected laser pulse amplitude information on a processing-electronics unit cell array from which the ranges and 3-D shape of multiple targets or obscured targets can be accurately determined.
Other objects of the invention will be disclosed in the detailed description of the preferred embodiments that follows and in the claims at the end of this disclosure.
A preferred embodiment of the present invention, the Penetrating 3-D Ladar (PDAR) imaging system depicted in
For intense photon sources the detector array chip 6 may be combined with the laser radar processor chip 8 in a monolithic design. In this configuration a small detector 7 would be incorporated in the laser radar processor processing-electronics unit cell 10 and no metal bumps would be necessary.
In an alternate design the anodes 23 are fabricated as a separate chip, an anode array chip. Metal pads on top of a substrate, typically ceramic, comprise the anode array chip; the anodes 23 collect electrons and pass the electrical signals through metal vias in the ceramic substrate to conducting bumps 9 at the underside of the anode array chip. Contact to the conducting bump 9 is made at the metal via on the bottom of the anode array chip substrate.
The vacuum tube may contain a transparent window 16 to transfer the laser light to a photocathode 17, where the laser light is converted to photoelectrons. The window 16 may be optical glass, a fiber optics plate or sapphire. A voltage between the photocathode and the electron amplifier 24 generates an electric field, E1, in the tube to accelerate photoelectrons into the electron amplifier 24. A voltage between the electron amplifier 24 and the anode array generates an electric field, E2, in the tube to accelerate electrons amplified in the electron amplifier 24 into the anodes 23 of the anode array. The laser radar processor chip 8 is mounted on a ceramic chip carrier 18 which contains wire bond pads 19 that communicate with the drive and output electronics 4 (
In an alternate design the electron amplifier 24 may not be present. A detector array chip 6 may replace the anode array chip, bump bonded to the laser radar processor chip 8 and enclosed in a vacuum tube 15. In this alternate design a voltage between the detector array 6 and the photocathode 17 accelerates electrons into the detector array 6.
In an alternate sensor 3 design, the
The high pass filter 29 on the TFR leg 26 is connected to a voltage amplifier 30, to another high pass filter 31, to a Schmitt Trigger 32 and then to a Memory 33. The Memory 33 is connected to the Peak Detector and Storage Circuitry 38. The Memory 33 is also connected so it can open the Switch 39. A ramp voltage or clock pulse line 53, from the drive and output electronics 4, passes through the Switch 39 to either an Analog Memory or a Pulse Counter 40. In alternate designs the ramp voltage or clock pulses are generated on the laser radar processor chip 8 or on a contiguous chip 22. The Memory 33 is also connected to a delay counter 42 which is connected to the 3DI-UDAR Amplitude Storage Circuitry 43.
All data storage circuitry, the Peak Detector/Storage 38, the Analog Memory/Pulse Counter 40 and the 3DI-UDAR Amplitude Storage Circuitry 43 is connected to the Output Amplifier 41. The Output Amplifier 41 is connected to the drive portion of the drive and output electronics 4 by the row line 46 and the column line 47. The output amplifier 41 is connected to the output portion of the drive and output electronics 4 by the first-return range line 48, the first-return amplitude peak line 49 and the range bin line 50. The delay counter 42 is connected to the drive and output electronics 4 by the bin count control 51 line and the 3DI-UDAR amplitude storage circuitry 43 is connected to the electronics 4 by the Read/Write and control line 52.
The PDAR imaging system functions as follows. A pulsed laser 1 (
For intense reflected signals or where the size of the laser radar processor processing-electronics unit cell 10 is large, a monolithic design can incorporate a small detector 7 into the laser radar processor processing-electronics unit cell 10. Under these circumstances the detector current flows directly into the processing-electronics unit cell circuitry 10.
Under the circumstances that a lens has been etched into the detector substrate 14, or a lens array has been placed above the detector array 6, light is collected by these lenses and focused onto the detector 7 region producing electric currents in the detectors 7 of a larger magnitude than would occur without the lenses, increasing signal-to-noise ratio.
Under the circumstances that the detector 7 is actually a solid-state amplification detector, such as an avalanche diode, the electric current generated in the detector is larger than for an ordinary detector, increasing signal-to-noise ratio.
If the
An alternate-design
All the
The detector or anode currents are processed by the laser radar processor processing-electronics unit cell circuitry 10, for all sensors 3 as follows. Detector currents are input to the RTIA 28 and converted to a voltage. The voltage output from the RTIA is separated into two legs, the TFR leg 26 and the AR leg 27. The first component in both legs is a high pass filter 29 or 34. The high pass filters 29 and 34 are important to reduce noise and increase signal-to-noise ratio. 1/f noise is largest at low frequency and thermal noise is flat in frequency space. Noise reduction and signal distortion are traded off in this component.
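As a rough software stand-in for the high pass filters 29 and 34, a one-pole high-pass filter illustrates the noise trade-off; the coefficient below is an arbitrary assumption:

```python
def high_pass(samples, alpha=0.9):
    """One-pole high-pass filter: y[n] = alpha * (y[n-1] + x[n] - x[n-1]).
    Higher alpha lowers the cutoff; the value here is arbitrary and
    stands in for the noise-versus-distortion trade-off noted above,
    suppressing the low-frequency band where 1/f noise dominates."""
    out, prev_x, prev_y = [], 0.0, 0.0
    for x in samples:
        y = alpha * (prev_y + x - prev_x)
        out.append(y)
        prev_x, prev_y = x, y
    return out
```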
The purpose of the TFR leg 26 is to measure the time, and hence the range, of the first return and to provide controlling input to the AR leg 27. Range is measured by opening the Switch 39 when the voltage amplitude from the RTIA 28, or equivalently the input current amplitude, is high enough. The voltage from the RTIA is filtered 29, amplified 30 and filtered 31 again. All filtering reduces noise and increases signal-to-noise ratio. The voltage signal then enters a Schmitt Trigger 32 which provides positive feedback so that saturation occurs rapidly, producing sharp rise times. Amplification and pulse shaping (and noise-reducing filtering) are required to obtain a large enough, fast-rising signal to trip the Memory 33 (typically a flip-flop with amplification) and open the Switch 39.
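A software analogue of the hysteresis behavior of the Schmitt Trigger 32 (threshold values invented for illustration):

```python
def schmitt_trigger(samples, v_high, v_low):
    """Software analogue of the Schmitt Trigger 32: the output latches
    high only when the input rises above v_high and releases only
    below v_low, so noise near a single threshold cannot produce
    repeated false triggers."""
    state, out = False, []
    for v in samples:
        if not state and v > v_high:
            state = True
        elif state and v < v_low:
            state = False
        out.append(state)
    return out

# The first index at which the output goes high plays the role of
# the Memory 33 trigger (first-return time).
trace = [0.00, 0.05, 0.20, 0.95, 0.60, 0.10]
print(schmitt_trigger(trace, v_high=0.8, v_low=0.3).index(True))  # 3
```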
The Switch 39 is connected to an Analog Memory (typically a capacitor) or a Range Counter (typically a modified high-speed shift register) 40 in two alternative designs. The ramp voltage or clock pulses pass through the Switch 39 via line 53, and this signal is terminated when the Switch 39 is opened by the first laser pulse return. The voltage on the capacitor is converted to time by knowledge of the ramp characteristics, or the number of counts on the counter is directly related to time, and this time determines the range of the first return via multiplication by the velocity of light and division by two.
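Both first-return range readout variants reduce to simple conversions; this sketch assumes a linear ramp and invented parameter names:

```python
C_LIGHT = 2.998e8  # m/s

def range_from_ramp(v_stored, v_start, ramp_slope_v_per_s):
    """Analog Memory variant (assumed linear ramp): the capacitor
    voltage frozen when the Switch 39 opens encodes the elapsed time,
    which converts to range as c * t / 2."""
    t = (v_stored - v_start) / ramp_slope_v_per_s
    return C_LIGHT * t / 2.0

def range_from_counts(counts, clock_period_s):
    """Pulse Counter variant: counts * clock period gives the time."""
    return C_LIGHT * counts * clock_period_s / 2.0
```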
The Memory 33 is triggered at the peak of the laser pulse for the weakest signal. Since it is not uncommon for target reflection coefficients to vary by an order of magnitude or more on the same target at the same range, the memory could be triggered at the peak of the pulse in one pixel and near the beginning of the pulse in another pixel. Thus if there were no amplitude correction processing (correcting for the measurable fact that a large amplitude laser pulse triggers the Memory 33 closer to the beginning of the laser pulse than a smaller amplitude laser pulse), the uncertainty in range would be the pulse rise time. Amplitude correction for range is necessary for very accurate first-return range calculations for each pixel. The time of first return in each pixel is the basis for determining the times of all other returns. The time at which the Memory 33 changes state, relative to the beginning of a given laser pulse shape, can be measured as a function of pulse amplitude. With knowledge of this function and of the amplitude of the reflected laser pulse, all transit times, and hence all ranges, can be corrected for laser pulse amplitude.
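A sketch of this amplitude (range-walk) correction; the calibration table is invented for illustration and would in practice be measured for the actual laser pulse shape:

```python
import numpy as np

# Invented calibration table: trigger delay of the Memory 33, relative
# to the start of the pulse, measured as a function of pulse amplitude
# (large pulses trip the trigger earlier, so their delay is smaller).
amplitudes = np.array([0.1, 0.5, 1.0, 5.0, 10.0])   # relative peak amplitude
trigger_delay_ns = np.array([9.0, 6.0, 4.0, 1.5, 0.5])

def corrected_transit_time_ns(t_measured_ns, amplitude):
    """Subtract the amplitude-dependent trigger delay (interpolated
    from the calibration curve) from the raw first-return time."""
    return t_measured_ns - np.interp(amplitude, amplitudes, trigger_delay_ns)
```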
The purpose of the AR leg 27 is twofold. One branch, the AFR leg 45, processes the RTIA 28 signal voltage to determine a peak signal for the first return so that the range to the first-return object can be determined with high accuracy. The primary AR leg 27 carries the RTIA 28 signal voltage to the 3DI-UDAR Amplitude Storage Circuitry 43 where succeeding return signals can be stored. If the scenario timing is such that the first return signal can be stored in the 3DI-UDAR Amplitude Storage Circuitry 43, then the Peak Detector/Storage 38 can be eliminated. Alternatively, if in practice the laser pulse rise time is short enough, or the range accuracy requirement is not too severe, it may also be possible to eliminate the Peak Detector/Storage 38. The Peak Detector/Storage 38 contains timing circuitry that shuts it off at a specific time after the Memory 33 is triggered so that other returns or reflections do not modify the peak signal. In one design the Peak Detector/Storage 38 is a typical peak detector with a storage capacitor. The peak detector could be made with just four CMOS transistors, for example: the input is on the gate of one transistor with the storage capacitor at the source; the two other transistors act as switches to reset the gate and the storage capacitor each cycle.
In an alternate design the Peak Detector/Storage 38 is similar to the 3DI-UDAR Amplitude Storage Circuitry 43. In this latter Peak Detector/Storage 38 design, signal amplitude is constantly stored on a sequence of capacitors. The time interval for switching from one capacitor to the next is fixed to be much smaller than the laser pulse rise time. When all the capacitors are charged, the first capacitor in the sequence is overwritten with the new amplitude voltage signal, and so on. The trigger signal from the Memory 33 turns on Peak Detector/Storage 38 timing circuitry which terminates the serpentine sequence of capacitor charging in a time interval predetermined to include the signal peak. A typical number of capacitors in the sequence is 20 and a typical capacitance is 0.25 pF for each storage capacitor.
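A toy model of the gated peak capture common to both Peak Detector/Storage 38 designs (the function name and gating details are simplifications, not the circuit itself):

```python
def captured_peak(samples, trigger_index, window):
    """Amplitude is tracked continuously, but once the Memory 33 trips
    only a predetermined window (e.g. 20 capacitor intervals) is
    retained, so later reflections cannot overwrite the first-return
    peak."""
    gate = samples[trigger_index: trigger_index + window]
    return max(gate) if gate else 0.0
```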
With either Peak Detector/Storage circuitry 38, the single capacitor storage or the multiple capacitor storage, the AFR leg 45 is buffered 37 to prevent feedback to the sensitive TFR leg 26. Although the amplification chain for the RTIA 28 voltage through leg 45, including the Peak Detector/Storage 38, is not necessarily linear throughout the range of laser-generated input currents, the laser pulse amplitude can be found from the measured nonlinear relationship. Thus the output of the Peak Detector/Storage 38 can be used to find the amplitude of the laser pulse.
In the main AR leg 27 the RTIA 28 voltage is amplified 35 and buffered 36 and enters the 3DI-UDAR Amplitude Storage Circuitry 43. This voltage varies in time as the laser photon pulse return signals vary in time. Buffering is required to prevent signals generated in the 3DI-UDAR Amplitude Storage Circuitry 43 from feeding back to the TFR leg 26 and triggering the Memory 33. As can be seen from U.S. Pat. No. 5,446,529,
The Delay Counter 42 determines when the capacitor charging begins. Amplitude storage could be going on all the time, as described above for the Peak Detector/Storage 38, so that amplitude information prior to the triggering of the Memory 33 is available, or capacitor charging (amplitude storage) could begin at a set time after the Memory 33 is triggered. The Delay Counter 42 is set at system start-up by the Pre-load Bin Count Control 51 either to stop the serpentine signal amplitude capture on the capacitors after a set time or to set the time interval, after the trigger of the Memory 33, at which signal amplitude capture and storage should begin on the capacitors. When the Delay Counter 42 is set to start capacitor amplitude charging after a certain time interval, charging is usually set to terminate when all capacitors have been charged.
After a time interval greater than the return time from the most distant target the Drive Electronics 4 begins sequencing through the laser radar processor processing-electronics unit cells 10, reading out the data. Each unit cell is selected in sequence by Row 46 and Column 47 selection pulses and the Output Amplifier 41 is turned on. The 3DI-UDAR Amplitude Storage Circuitry 43 is set for read by the Read/Write and Control 52. The Output Amplifier 41 contains one or more amplifiers (typically source followers) and switches the data storage components to their respective output lines. The Output Amplifier 41 drives the data signals to a chip output amplifier and multiplexer 12 which in turn drives the signals to the Drive and Output Electronics 4.
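A sketch of this readout and reset sequence, with invented field names standing in for the stored quantities:

```python
def read_out(unit_cells):
    """Between-pulse readout: each unit cell is selected by row and
    column, its three stored quantities are driven to the output
    lines, and the cell is reset for the next laser pulse."""
    frame = []
    for row, cells in enumerate(unit_cells):
        for col, cell in enumerate(cells):
            frame.append({
                "row": row, "col": col,
                "first_return_range": cell["range"],  # Range line 48
                "first_return_peak": cell["peak"],    # Peak line 49
                "range_bins": list(cell["bins"]),     # Range Bin line 50
            })
            cell.update(range=0.0, peak=0.0, bins=[])  # reset for next pulse
    return frame
```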
The first-return Peak 49 signal output from the Peak Detector/Storage 38, the first-return Range 48 output from the Analog Memory or Counter 40 and all the target-reflection, Range Bin 50 data from the 3DI-UDAR Amplitude Storage Circuitry 43 are processed by the image processing computer 4a. This data could be displayed as a 3D image or could be further processed to isolate targets. After the Drive and Output electronics 4 has processed the laser radar processor processing-electronics unit cell 10 output voltages, each unit cell 10 is reset by Drive Electronics 4 generated pulses, making this processing-electronics unit cell ready for the next laser pulse.
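Finally, a sketch of how the processing computer could anchor each pixel's range bins to its first-return time; this is a simplification under assumed names, as the patent does not detail the algorithms:

```python
C_LIGHT = 2.998e8  # m/s

def bin_ranges(t_first_s, bin_period_s, num_bins):
    """Each pixel's first-return time anchors its range-bin sequence:
    bin k corresponds to the absolute range c * (t_first + k * dt) / 2,
    so the per-pixel bins assemble into a 3-D amplitude volume."""
    return [C_LIGHT * (t_first_s + k * bin_period_s) / 2.0
            for k in range(num_bins)]
```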
Inventors: Roger Stettner; Howard W. Bailey
References Cited:
U.S. Pat. No. 4,652,766, Dec. 16, 1985, Lockheed Martin Corporation: Direct coupled charge injection readout circuit and readout method for an IR sensing charge injection device.
U.S. Pat. No. 4,862,257, Jul. 7, 1988, Kaman Aerospace Corporation: Imaging lidar system.
U.S. Pat. No. 5,101,108, Dec. 14, 1988, Hughes Aircraft Company: Split dynamic range using dual array sensor chip assembly.
U.S. Pat. No. 5,446,529, Mar. 23, 1992, Continental Autonomous Mobility US, LLC: 3D imaging underwater laser radar.