A LiDAR-based 3-D point cloud measuring system includes a base, a housing, a plurality of photon transmitters and photon detectors contained within the housing, a rotary motor that rotates the housing about the base, and a communication component that allows transmission of signals generated by the photon detectors to external components. In several versions of the invention, the system includes a vertically oriented motherboard, thin circuit boards such as ceramic hybrids for selectively mounting emitters and detectors, a conjoined D-shaped lens array, and preferred firing sequences.
27. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a rotary component configured to rotate the head assembly with respect to the base along an axis of rotation;
a motherboard carried in the head assembly;
a lens positioned at a periphery of the head assembly;
a mirror positioned at the periphery of the head assembly;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards mounted to the motherboard;
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards mounted to the motherboard;
a processor coupled to the plurality of photon transmitters; and
a memory including processor executable code, wherein the processor executable code, upon execution by the processor, configures the processor to cause firing of fewer than the entire plurality of photon transmitters at a time.
19. A LiDAR-based sensor system comprising:
a base having a head assembly and a rotary component configured to rotate the head assembly with respect to the base, the head assembly further having a circumference spaced apart from an axis of rotation of the head assembly;
a motherboard carried in the head assembly;
a lens positioned on the head assembly along the circumference of the head assembly;
a mirror positioned on the head assembly along the circumference of the head assembly;
a plurality of transmitters mounted to a plurality of emitter circuit boards carried on the head assembly for rotation with the head assembly, the plurality of transmitters positioned to transmit light pulses through the lens;
a plurality of detectors mounted to a plurality of detector circuit boards carried on the head assembly for rotation with the head assembly, the plurality of detectors positioned to receive reflected light pulses from one or more surfaces;
a processor coupled to the plurality of transmitters; and
a memory including processor executable code, wherein the processor executable code, upon execution by the processor, configures the processor to cause firing of fewer than the entire plurality of transmitters at a time.
1. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a rotary component configured to rotate the head assembly with respect to the base, the rotation of the head assembly defining an axis of rotation;
an electrical motherboard carried in the head assembly, the motherboard defining a plane and being positioned substantially parallel to the axis of rotation;
a lens positioned on the head assembly on a first side of the motherboard;
a mirror positioned on the head assembly on a second side of the motherboard;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards being mounted directly to the motherboard; and
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards being mounted directly to the motherboard.
2. The sensor system of
the lens comprises an emitter lens and a detector lens, the emitter lens and the detector lens being positioned adjacent one another; and
the mirror comprises an emitter mirror and a detector mirror;
wherein the emitter mirror is positioned within the head assembly to reflect light from the plurality of photon transmitters through the emitter lens, and the detector mirror is positioned within the head to reflect light received through the detector lens toward the plurality of detectors.
3. The sensor system of
4. The sensor system of
5. The sensor system of
6. The sensor system of
7. The sensor system of
8. The sensor system of
9. The sensor system of
10. The sensor system of
11. The sensor system of
12. The sensor system of
13. The sensor system of
14. The sensor system of
a first group forming a first portion of the first vertical stack and organized sequentially from a first top position to a first bottom position; and
a second group forming a remaining portion of the first vertical stack organized sequentially from a second top position to a second bottom position;
whereby the control component causes firing of the emitters to alternate between the first group and the second group, and further causes firing within the first group to proceed sequentially and firing within the second group to proceed sequentially.
15. The sensor system of
16. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a motor configured to rotate the head assembly with respect to the base, the rotation of the head assembly defining an axis of rotation;
an electrical motherboard carried in the head assembly;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards being mounted to the motherboard;
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards being mounted to the motherboard;
an emitter mirror supported within the head assembly;
a detector mirror supported within the head assembly; and
a conjoined D-shaped lens assembly, the lens assembly forming an emitter portion and a detector portion;
wherein the motherboard is a unitary component for mounting the plurality of emitter circuit boards and the plurality of detector circuit boards, the motherboard being positioned between the emitter mirror and the detector mirror on a first side and the lens assembly on the other side, the motherboard further having an opening to allow light to pass between the lens assembly and either the detector mirror or the emitter mirror;
whereby light transmitted by one of the plurality of emitters is reflected from the emitter mirror and passes through the emitter portion of the lens assembly, and light received by the detector portion of the lens assembly is reflected by the detector mirror and received by one of the plurality of detectors.
17. The sensor system of
18. The sensor system of
a control component for causing the firing of the plurality of emitters; and
further wherein there are n emitters in the plurality of emitters, the n emitters being positioned in a vertical stack from 1 to n, the plurality of emitters being divided into two groups, including a first group of emitters from 1 to n/2 and a second group of emitters from n/2+1 to n; wherein the control component causes the emitters to fire alternatingly between the first group and the second group, and to fire sequentially within each group such that emitter 1 and emitter n/2+1 fire sequentially.
20. The sensor system of claim 19, wherein the processor is configured to cause the firing of only one of the plurality of transmitters at a time.
21. The sensor system of claim 20, wherein the plurality of transmitters and the plurality of detectors form a plurality of transmitter-detector pairs, and wherein the processor is configured to cause only one transmitter-detector pair to be active at any time.
22. The sensor system of claim 19, wherein each one of the transmitters from among the plurality of transmitters is physically adjacent to at least one other of the transmitters from among the plurality of transmitters, and wherein the processor is configured to cause the firing of the plurality of transmitters in a non-adjacent firing order, such that at no time do adjacent transmitters fire consecutively in sequence.
23. The sensor system of claim 19, wherein:
there are n transmitters in the plurality of transmitters, the n transmitters being positioned in a sequence from 1 to n, the plurality of transmitters being divided into two groups, including a first group of transmitters from 1 to n/2 and a second group of transmitters from n/2+1 to n; and
wherein the processor is configured to cause the transmitters to fire alternatingly between the first group and the second group, and to fire sequentially within each group such that transmitter 1 and transmitter n/2+1 fire sequentially.
24. The sensor system of claim 19, wherein the lens and the mirror are positioned along the circumference of the head assembly such that a center of gravity of the head assembly corresponds to the axis of rotation.
25. The sensor system of claim 19, wherein the head assembly is configured to rotate at a rotational speed, and wherein the processor is configured to cause the firing of fewer than the entire plurality of transmitters according to the rotational speed.
26. The sensor system of claim 25, wherein the processor is configured to start the firing of fewer than the entire plurality of transmitters upon determining that the rotational speed reaches a threshold.
28. The sensor system of claim 27, wherein the processor is configured to cause the firing of only one photon transmitter at a time.
29. The sensor system of claim 28, wherein the plurality of photon transmitters and the plurality of detectors form a plurality of transmitter-detector pairs, and wherein only one transmitter-detector pair is active at any time.
30. The sensor system of claim 27, wherein the processor is configured to cause the firing of the photon transmitters in a non-adjacent firing order, such that at no time do adjacent photon transmitters fire in sequence.
31. The sensor system of claim 27, wherein:
the plurality of transmitters includes n transmitters positioned in a sequence from 1 to n, the n transmitters divided into two groups, including a first group of transmitters from 1 to n/2 and a second group of transmitters from n/2+1 to n; and
wherein the processor is configured to cause the transmitters to fire alternatingly between the first group and the second group, and to fire sequentially within each group such that transmitter 1 and transmitter n/2+1 fire sequentially.
32. The sensor system of claim 27, wherein the motherboard, the rotary component, the lens, and the mirror are enclosed within the head assembly, and wherein the lens and the mirror are positioned at the periphery of the head assembly such that a center of gravity of the head assembly corresponds to the axis of rotation of the rotary component.
33. The sensor system of claim 27, wherein the rotary component is configured to rotate at a rotational speed, and wherein the processor is configured to cause the firing of fewer than the entire plurality of photon transmitters according to the rotational speed.
34. The sensor system of claim 33, wherein the processor is configured to start the firing of fewer than the entire plurality of transmitters upon determining that the rotational speed reaches a threshold.
Through the use of DSP, a dynamic power feature allows the system to increase the intensity of the laser emitters if a clear terrain reflection is not obtained by the photo detectors (whether due to reflective surface, weather, dust, distance, or other reasons), and to reduce power to the laser emitters for laser life and safety reasons if a strong reflection signal is detected by the photo detectors. A direct benefit of this feature is that the LiDAR system is capable of seeing through fog, dust, and heavy rain by increasing laser power dynamically and ignoring early reflections. The unit also has the capability to receive and decipher multiple returns from a single laser emission through digitization and analysis of the waveform generated by the detector as the signal from the emitter returns.
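To illustrate the logic, a minimal Python sketch of such a dynamic power loop follows. The threshold constants, step size, and the waveform helper are hypothetical placeholders chosen for illustration; they are not taken from the patented design.

```python
# Hedged sketch of the dynamic power feature: raise emitter power when no clear
# return is detected (fog, dust, distance), back it off when a strong return is
# detected (laser life and eye safety). All constants here are illustrative.

WEAK_RETURN = 0.15    # normalized amplitude below which the return is "unclear"
STRONG_RETURN = 0.85  # normalized amplitude above which power is reduced
POWER_MIN, POWER_MAX = 0.1, 1.0
POWER_STEP = 0.05

def adjust_power(current_power: float, return_amplitude: float) -> float:
    """Return the next laser power setting given the last detected return amplitude."""
    if return_amplitude < WEAK_RETURN:
        return min(POWER_MAX, current_power + POWER_STEP)
    if return_amplitude > STRONG_RETURN:
        return max(POWER_MIN, current_power - POWER_STEP)
    return current_power

def strongest_late_return(waveform, ignore_before: int):
    """Index of the strongest sample after an early-reflection cutoff, so that
    near-field scatter from fog or dust can be ignored; None if nothing remains."""
    tail = waveform[ignore_before:]
    if not tail:
        return None
    return ignore_before + max(range(len(tail)), key=tail.__getitem__)
```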
The LiDAR systems of
This highly detailed terrain map is then used to calculate obstacle avoidance vectors if required and to determine the maximum allowable speed given the terrain ahead. The LiDAR system identifies the size and distance of objects in view, including the vertical position and contour of a road surface. The anticipated offset of the vehicle from a straight, level path, either vertical or horizontal, at different distances is translated into the G-force that the vehicle will be subject to when following the proposed path at the current speed. That information can be used to determine the maximum speed that the vehicle should be traveling, and acceleration or braking commands are issued accordingly. In all cases the software seeks the best available road surface (and thus the best possible speed) still within the boundaries of a global positioning system (GPS) waypoint being traversed.
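As a rough illustration of the speed-limiting step, the sketch below converts an anticipated path offset at a given look-ahead distance into a curvature and caps speed by an assumed G-force limit; the 0.3 g limit and the circular-arc approximation are illustrative assumptions, not figures from the patent.

```python
import math

def max_speed_for_offset(offset_m: float, lookahead_m: float, g_limit: float = 0.3) -> float:
    """Rough speed cap (m/s) so that correcting toward a path offset 'offset_m'
    seen 'lookahead_m' ahead stays under 'g_limit' g of lateral or vertical
    acceleration, approximating the correction as an arc of curvature 2*y/d^2."""
    curvature = 2.0 * abs(offset_m) / (lookahead_m ** 2)
    if curvature == 0.0:
        return float("inf")
    return math.sqrt(g_limit * 9.81 / curvature)

# Example: a 0.5 m offset observed 20 m ahead with a 0.3 g limit allows roughly 34 m/s.
```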
One version of the inventor's prior system as illustrated in
In the versions as illustrated in
As shown in FIG. 20, it is also advantageous to fire only several lasers, or preferably just one, at a time 202. This is because of naturally occurring crosstalk, or system blinding, that occurs when the laser beam encounters a retroreflector. Such retroreflectors are commonly used along roadways. A single-beam-at-a-time system is thus resistant to retroreflector blinding, while a flash system could suffer severe image degradation as a result.
In addition to crosstalk concerns, firing single lasers at once while rotating at a high rate facilitates eye safety. The high-powered lasers used with the present preferred versions of the invention would require protective eyewear if the system were used in a stationary fashion. Rotating the system and firing fewer lasers at once for brief pulses allows high-powered lasers to be used while still meeting eye safety requirements without protective eyewear. In accordance with this aspect of the invention, the system employs a control component that does not allow the emitters to fire until the head has reached a desired minimum rotation speed.
Another advantage of firing only a small number of lasers at a time is the ability to share, or multiplex, the detection circuitry among several detectors. Since the detection circuitry consists of high speed Analog to Digital Converters (ADCs), such as those made by National Semiconductor, considerable cost savings can be had by minimizing the use of these expensive components.
In the preferred embodiment, the detectors are power cycled, such that only the desired detector is powered up at any one time. The signals can then simply be multiplexed together. An additional benefit of power-cycling the detectors is that total system power consumption is reduced; the detectors run cooler and are therefore more sensitive.
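A minimal sketch of this multiplexing and power-cycling scheme is given below; the DetectorBank class and the power_enable, mux_select, and adc_capture driver calls are invented stand-ins for whatever hardware interface a real unit would expose.

```python
# Hedged sketch: only the detector paired with the firing emitter is powered up,
# and every detector shares one high-speed ADC through an analog multiplexer.

class DetectorBank:
    def __init__(self, hw, num_detectors: int):
        self.hw = hw                # hypothetical hardware driver object
        self.num = num_detectors
        self.active = None

    def activate(self, index: int) -> None:
        """Power-cycle so exactly one detector is on, routed to the shared ADC."""
        if self.active is not None and self.active != index:
            self.hw.power_enable(self.active, False)   # previous detector off
        self.hw.power_enable(index, True)              # desired detector on
        self.hw.mux_select(index)                      # one ADC serves all detectors
        self.active = index

    def capture_return(self, samples: int):
        """Digitize the return waveform of the currently active detector."""
        return self.hw.adc_capture(samples)
```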
A simple DC motor controller driving a high reliability brushed or brushless motor controls the rotation of the emitter/detectors. A rotary encoder feeds rotational position to the DSPs (or other microprocessor) that use the position data to determine firing sequence. Software and physical fail-safes ensure that no firing takes place until the system is rotating at a minimum RPM.
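The sketch below shows one way the encoder-driven firing gate might be structured: rotation speed is estimated from encoder ticks, and firing is permitted only above a minimum RPM. The tick count, minimum RPM, and class name are illustrative assumptions rather than values from the patent.

```python
import time

MIN_RPM = 300            # illustrative minimum rotation speed before firing is allowed
TICKS_PER_REV = 4096     # illustrative encoder resolution

class FiringGate:
    """Software interlock: estimate RPM from encoder ticks and gate emitter firing."""

    def __init__(self):
        self.last_tick = None
        self.last_time = None
        self.rpm = 0.0

    def on_encoder(self, tick: int) -> None:
        now = time.monotonic()
        if self.last_tick is not None and now > self.last_time:
            dticks = (tick - self.last_tick) % TICKS_PER_REV
            self.rpm = (dticks / TICKS_PER_REV) / (now - self.last_time) * 60.0
        self.last_tick, self.last_time = tick, now

    def may_fire(self) -> bool:
        """No firing takes place until the head reaches the minimum RPM."""
        return self.rpm >= MIN_RPM
```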
The LiDAR system of
The version described below with reference to
In a preferred version as illustrated in
The sample embodiment of
The hybrids 32 are mounted to the motherboard in a fan pattern that is organized about a central axis. In the version as shown, 32 hybrids are used in a pattern that creates a field of view extending 10 degrees above and 30 degrees below the horizon; the central axis therefore extends above and below the ninth board 38, with 8 boards above and 23 boards below the central axis. In one version, each successive board is inclined an additional one and one-third degree with respect to the next adjacent board. The desired incremental and overall inclination may be varied depending on the number of hybrids used, the geometry of the mirrors and lenses, and the desired range of the system.
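For illustration only, the following sketch tabulates the approximate inclination of each board under the stated one-and-one-third-degree increment, taking the ninth board as the reference; exact angles in a built unit depend on the mirror and lens geometry.

```python
NUM_BOARDS = 32
STEP_DEG = 4.0 / 3.0     # one and one-third degree between adjacent boards
CENTER_INDEX = 9         # the ninth board sits nearest the central axis

def board_angles():
    """Approximate inclination of each hybrid board relative to the central axis,
    in degrees (positive = above the axis); board 1 is the top of the stack."""
    return [(CENTER_INDEX - n) * STEP_DEG for n in range(1, NUM_BOARDS + 1)]

angles = board_angles()
# Board 1 comes out near +10.7 degrees and board 32 near -30.7 degrees, roughly
# matching the field of view of 10 degrees above and 30 degrees below the horizon.
```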
One of the features allowing for compact size and improved performance of the version of
One of the advantages of mounting emitters and detectors on individual hybrid boards is the ability to then secure the individual hybrid boards to the motherboard in a vertically aligned configuration. In the illustrated version, the detectors are positioned in a first vertical alignment along a first vertical axis while the emitters are positioned in a second vertical alignment along a second vertical axis, with the first and second vertical axes being parallel and next to one another. Thus, as best seen in
As further shown in
The density of emitter/detector pairs populated along the vertical FOV is intentionally variable. While 32 pairs of emitters and detectors are shown in the illustrated versions, the use of hybrids and a motherboard allows for a reduction in the number of emitters and detectors by simply removing or not installing any desired number of emitter/detector pairs. This variation of the invention cuts down on the number of vertical lines the sensor produces, and thus reduces cost. It is feasible that just a few emitter/detector pairs will accomplish the goals of certain autonomous vehicles or mapping applications. For some uses, increased density is desirable to facilitate seeing objects at further distances and with more vertical resolution. Other uses exploit the fact that there is a direct relationship between the number of emitter/detector pairs and sensor cost, and do not need the full spread of vertical lasers to accomplish their sensor goals.
Alternatively, multiple emitters and detectors can be designed and mounted onto the hybrid boards at slightly different vertical angles, thus increasing the density of vertical FOV coverage in the same footprint. If, for example, two emitters and two detectors were mounted on each of the hybrids shown in
Another design feature of the preferred version is the vertical motherboard on which the main electronics that control the firing of the lasers and the capturing of returns are located. As noted above, the motherboard is mounted vertically, defining a plane that is preferably parallel to the central axis 13 (see
Another feature of the vertical motherboard design is its placement inside the sensor head. In order to optimize space, the motherboard is positioned between the mirror and the lenses, as best seen in
This configuration allows the hybrid emitters to fire rearward into the first mirror 40; the light then reflects off the mirror and travels through the hole 24 in the motherboard 20 and through the lens 50, so that the emitted light 60 travels out to the target 70. This configuration further increases the net focal length of the light path while retaining small size. Likewise, the returning light 62 passes through the detector lens 52 and through the hole 24 in the motherboard to the opposite mirror, where it is reflected into the corresponding detector.
Another benefit of the vertical motherboard design is that it facilitates the goal of balancing the sensor head both statically and dynamically to avoid shimmy and vibration during operation. Most preferably, the various components are positioned to allow a near-balanced condition upon initial assembly that requires a minimum of final static and dynamic balancing counterweights. As best seen in
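A first-pass static balance check can be sketched as below: sum the mass moments of the head components about the axis of rotation and size a trim counterweight to cancel the residual. The component list and radius are placeholders, and a real design would also require dynamic (two-plane) balancing.

```python
def residual_moment(components):
    """components: iterable of (mass_kg, x_m, y_m), with the axis of rotation at (0, 0).
    Returns the net (Mx, My) mass moment that a counterweight must cancel."""
    mx = sum(m * x for m, x, _ in components)
    my = sum(m * y for m, _, y in components)
    return mx, my

def trim_counterweight(components, radius_m: float) -> float:
    """Mass (kg) to place at 'radius_m' from the axis, opposite the residual moment."""
    mx, my = residual_moment(components)
    return (mx ** 2 + my ** 2) ** 0.5 / radius_m

# With the lenses on one side of the motherboard and the mirrors on the other,
# the residual moment is small and only a light trim mass is needed.
```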
When the present invention is incorporated into an autonomous navigation or mobile mapping vehicle, GPS and inertial sensors are often included to locate the vehicle in space and correct for normal vehicle motion. Inertial sensors often include gyros, such as fiber optic gyros (FOG), and accelerometers. In one embodiment, there is a 6-axis inertial sensor system mounted in the LiDAR base and the signals from the gyros and accelerometers are output along with the LiDAR distance and intensity data.
The separate location of emitters' and detectors' optical paths can create a parallax problem. When the emitters and detectors are separated by a finite distance there always exists a “blind” region nearest to the sensor in which objects cannot be illuminated or detected. Likewise, at long range the emitter's laser light becomes misaligned with its corresponding detector and creates a similar blind spot. The parallax problem is best seen with reference to
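The near-field blind region can be approximated with simple geometry, as in the sketch below; the 30 mm separation and 0.2-degree detector half-angle are illustrative numbers, not values from the patent.

```python
import math

def blind_range(separation_m: float, detector_half_fov_deg: float) -> float:
    """Approximate range (m) below which a target illuminated by the emitter falls
    outside the detector's field of view, for parallel emitter/detector axes."""
    return separation_m / math.tan(math.radians(detector_half_fov_deg))

# Example: a 30 mm emitter/detector separation and a 0.2-degree detector half-angle
# give a blind region of roughly 8.6 m, which is the motivation for bringing the two
# optical paths together with the conjoined D-shaped lens pair described below.
```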
This effect can be alleviated in one version of the invention by having two “D”-shaped lenses 50, 52 (see
Due to the complex nature of the optical propagation in lenses, a lens array is usually needed to correct for various aberrations that are commonly associated with any optical design. For the purpose of constructing a conjoint lens system to overcome the parallax problem described with respect to
The creation of D-shaped lenses and the use of a conjoined pair of D-shaped lens arrays, however, brings a potential signal loss.
By configuring the lenses in an ideal fashion as illustrated in
Another unique design consideration for the preferred implementation addresses the need to transfer power and signal up to the head, and receive signal and offer grounding down from the head. Off-the-shelf mercury-based rotary couplers are too unreliable and too large for this application. In one embodiment, shown in
It is also desired to have the distance returns of the LiDAR scanner be as accurate as possible and be free of spurious images or returns. Firing multiple lasers at once can create a crosstalk condition where the light emitted from one laser is inadvertently detected by the detector of another laser, thus giving a false return. Thus, with reference to
A similar error can occur if adjacent lasers are fired in a sequential fashion. Thus, with reference to
In accordance with a preferred version of the invention, the emitters are fired in a non-adjacent, single-laser firing order. This means that only one emitter/detector pair is active at any given time, and at no time do adjacent emitters and detectors fire in sequence. Most preferably there is as much distance as possible between the emitters that are fired in order. Thus, if there are 32 emitters in a vertical stack, the emitters would be assigned labels E1, representing the top-most emitter, through E32, representing the bottom emitter in the stack. Emitter E1 (at the top) would be fired first, followed by emitter E17 (in the middle of the stack), then E2, E18, E3, E19, and so on, ending with E16 and E32 before starting over again at the beginning. This pattern begins with the top emitter and the middle emitter, dividing the stack into two groups. It then alternates firing one from each group, moving from the top of each half-stack and proceeding sequentially down each half-stack of emitters in this alternating fashion and then repeating. This pattern ensures the largest possible distance between fired lasers, thereby reducing the chance of crosstalk.
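The firing order described above can be generated mechanically, as in the sketch below (illustrative code, assuming an even number of emitters numbered from the top of the stack).

```python
def firing_order(n: int = 32):
    """Emitter indices (1-based, top to bottom) in the alternating half-stack order
    E1, E(n/2+1), E2, E(n/2+2), ..., E(n/2), En described above."""
    assert n % 2 == 0, "sketch assumes an even number of emitters"
    half = n // 2
    order = []
    for i in range(1, half + 1):
        order.append(i)          # next emitter from the top half-stack
        order.append(half + i)   # matching emitter from the bottom half-stack
    return order

print(firing_order())  # [1, 17, 2, 18, 3, 19, ..., 16, 32]
```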
While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.