A LiDAR-based 3-D point cloud measuring system includes a base, a housing, a plurality of photon transmitters and photon detectors contained within the housing, a rotary motor that rotates the housing about the base, and a communication component that allows transmission of signals generated by the photon detectors to external components. In several versions of the invention, the system includes a vertically oriented motherboard, thin circuit boards such as ceramic hybrids for selectively mounting emitters and detectors, a conjoined D-shaped lens array, and preferred firing sequences.

Patent: RE48491
Priority: Jul 13 2006
Filed: Sep 11 2017
Issued: Mar 30 2021
Expiry: Jul 13 2027

25. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a rotary component configured to rotate the head assembly with respect to the base along an axis of rotation;
a motherboard carried in the head assembly;
a lens positioned at a periphery of the head assembly;
a mirror positioned at the periphery of the head assembly;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards mounted to the motherboard; and
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards mounted to the motherboard,
wherein the rotary component includes a rotary coupler that is configured to transmit a signal between the base and the head assembly through the center of the rotary component.
19. A LiDAR-based sensor system comprising:
a base having a head assembly and a rotary component configured to rotate the head assembly with respect to the base, the head assembly further having a circumference spaced apart from an axis of rotation of the head assembly;
an electrical motherboard carried in the head assembly;
a lens positioned on the head assembly along the circumference of the head assembly;
a mirror positioned on the head assembly along the circumference of the head assembly;
a plurality of transmitters carried on the head assembly for rotation with the head assembly, the plurality of transmitters positioned to transmit light pulses through the lens; and
a plurality of detectors carried on the head assembly for rotation with the head assembly, the plurality of detectors positioned to receive the light pulses after reflection from one or more surfaces,
wherein the rotary component includes a rotary coupler that is configured to transmit a signal between the base and the head assembly through the center of the rotary component.
1. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a rotary component configured to rotate the head assembly with respect to the base, the rotation of the head assembly defining an axis of rotation;
an electrical motherboard carried in the head assembly, the motherboard defining a plane and being positioned substantially parallel to the axis of rotation;
a lens positioned on the head assembly on a first side of the motherboard;
a mirror positioned on the head assembly on a second side of the motherboard;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards being mounted directly to the motherboard; and
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards being mounted directly to the motherboard.
2. The sensor system of claim 1, wherein
the lens comprises an emitter lens and a detector lens, the emitter lens and the detector lens being positioned adjacent one another; and
the mirror comprises an emitter mirror and a detector mirror;
wherein the emitter mirror is positioned within the head assembly to reflect light from the plurality of photon transmitters through the emitter lens, and the detector mirror is positioned within the head to reflect light received through the detector lens toward the plurality of detectors.
3. The sensor system of claim 2, further comprising a unitary support structure, the motherboard, detector lens, emitter lens, detector mirror, and emitter mirror all being secured to the unitary support structure.
4. The sensor system of claim 2, wherein the plurality of emitters are oriented to transmit light from the second side of the motherboard toward the emitter mirror.
5. The sensor system of claim 4, wherein the motherboard comprises a central opening, the central opening being positioned to allow light from the emitters to pass from emitter mirror through the central opening and toward the emitter lens.
6. The sensor system of claim 5, wherein the central opening is further positioned to allow light to pass from the detector lens through the central opening and toward the detector mirror.
7. The sensor system of claim 2, wherein the plurality of emitter circuit boards are secured to the motherboard to form a first vertical stack.
8. The sensor system of claim 7, wherein the first vertical stack of emitter circuit boards forms an angularly fanned array.
9. The sensor system of claim 7, wherein the plurality of detector circuit boards are secured to the motherboard to form a second vertical stack, the first vertical stack of emitter circuit boards being positioned substantially parallel to the second vertical stack of detector circuit boards.
10. The sensor system of claim 9, wherein the second vertical stack of detector circuit boards forms an angularly fanned array.
11. The sensor system of claim 2, wherein the emitter lens comprises a first D-shaped lens and the detector lens comprises a second D-shaped lens, a respective vertical side of each of the first D-shaped lens and the second D-shaped lens being positioned closely adjacent one another to form a conjoined D-shaped lens array.
12. The sensor system of claim 11, wherein the first D-shaped lens comprises a first plurality of lenses, and wherein the second D-shaped lens comprises a second plurality of lenses.
13. The sensor system of claim 2, wherein the plurality of emitter circuit boards are secured to the motherboard to form a first vertical stack, the first vertical stack being divided into at least two groups of emitters, each of the at least two groups comprising several emitters from the plurality of emitters such that the at least two groups form non-overlapping subsets of the plurality of emitters, the sensor further having a control component to control the firing of the emitters such that one emitter is fired at a time, the control component further causing firing from one of the at least two groups and then the other of the at least two groups in an alternating fashion.
14. The sensor system of claim 13, wherein the at least two groups comprises:
a first group forming a first portion of the first vertical stack and organized sequentially from a first top position to a first bottom position; and
a second group forming a remaining portion of the first vertical stack organized sequentially from a second top position to a second bottom position;
whereby the control component causes firing of the emitters to alternate between the first group and the second group, and further causes firing within the first group to proceed sequentially and firing within the second group to proceed sequentially.
15. The sensor system of claim 2, wherein the rotary component further comprises a capacitive coupler.
16. A LiDAR-based sensor system comprising:
a base;
a head assembly;
a motor configured to rotate the head assembly with respect to the base, the rotation of the head assembly defining an axis of rotation;
an electrical motherboard carried in the head assembly;
a plurality of photon transmitters mounted to a plurality of emitter circuit boards, the plurality of emitter circuit boards being mounted to the motherboard;
a plurality of detectors mounted to a plurality of detector circuit boards, the plurality of detector circuit boards being mounted to the motherboard;
an emitter mirror supported within the head assembly;
a detector mirror supported within the head assembly; and
a conjoined D-shaped lens assembly, the lens assembly forming an emitter portion and a detector portion;
wherein the motherboard is a unitary component for mounting the plurality of emitter circuit boards and the plurality of detector circuit boards, the motherboard being positioned between the emitter mirror and the detector mirror on a first side and the lens assembly on the other side, the motherboard further having an opening to allow light to pass between the lens assembly and either the detector mirror or the emitter mirror;
whereby light transmitted by one of the plurality of emitters is reflected from the emitter mirror and passes through the emitter portion of the lens assembly, and light received by the detector portion of the lens assembly is reflected by the detector mirror and received by one of the plurality of detectors.
17. The sensor system of claim 16, wherein the motherboard defines a plane that is parallel to the axis of rotation.
18. The sensor system of claim 17, further comprising:
a control component for causing the firing of the plurality of emitters; and
further wherein there are n emitters in the plurality of emitters, the n emitters being positioned in a vertical stack from 1 to n, the plurality of emitters being divided into two groups, including a first group of emitters from 1 to n/2 and a second group of emitters from n/2+1 to n; wherein the control component causes the emitters to fire alternatingly between the first group and the second group, and to fire sequentially within each group such that emitter 1 and emitter n/2+1 fire sequentially.
20. The sensor system of claim 19, wherein the rotary coupler is further configured to provide power and ground.
21. The sensor system of claim 19, wherein the rotary coupler is configured to:
provide power from the base to the head assembly; and
provide grounding down to the base from the head assembly.
22. The sensor system of claim 19, wherein the signal is configured to communicate serial commands from the base to the head assembly.
23. The sensor system of claim 22, wherein the serial commands include commands to limit horizontal field of view, fire all the transmitters at full power, and update firmware.
24. The sensor system of claim 19, wherein the rotary coupler includes:
a rotary transformer configured to send power and the signal from the base to the head assembly; and
a capacitive coupler configured to receive the signal and provide grounding down to the base from the head assembly.
26. The sensor system of claim 25, wherein the rotary coupler is further configured to provide power and ground.
27. The sensor system of claim 25, wherein the rotary coupler is configured to:
provide power from the base to the head assembly; and
provide grounding down to the base from the head assembly.
28. The sensor system of claim 25, wherein the signal is configured to communicate serial commands from the base to the head assembly.
29. The sensor system of claim 28, wherein the serial commands include commands to limit horizontal field of view, fire all the photon transmitters at full power, and update firmware.
30. The sensor system of claim 25, wherein the rotary coupler includes:
a rotary transformer configured to send power and the signal from the base to the head assembly; and
a capacitive coupler configured to receive the signal and provide grounding down to the base from the head assembly.

This application is a reissue continuation of application Ser. No. 15/180,580, filed Jun. 13, 2016, which is an application for reissue of U.S. Pat. No. 8,767,190, issued Jul. 1, 2014, which claims the benefit of U.S. provisional application Ser. No. 61/345,505 filed May 17, 2010 and which is a continuation-in-part of U.S. application Ser. No. 11/777,802, now U.S. Pat. No. 7,969,558, filed Jul. 13, 2007, and further which claims the benefit of U.S. provisional application Ser. No. 60/807,305 filed Jul. 13, 2006, and U.S. provisional application Ser. No. 61/345,505 filed May 17, 2010. Notice: more than one reissue application has been filed for the reissue of U.S. Pat. No. 8,767,190. The reissue applications are U.S. application Ser. No. 15/180,580, filed Jun. 13, 2016; and U.S. application Ser. Nos. 15/700,543, 15/700,558, 15/700,571, 15/700,836, 15/700,844, 15/700,959, and 15/700,965, each of which was filed on Sep. 11, 2017; and U.S. application Ser. No. 16/912,648, filed Jun. 25, 2020. The contents of each of the foregoing applications are hereby incorporated by reference.

The present invention concerns the use of light pulses that are transmitted, reflected from external objects, and received by a detector to locate the objects in the field of view of the transmitter. By pulsing a laser emitter and receiving the reflection, the time required for the pulse of light to return to the detector can be measured, thereby allowing a calculation of the distance between the emitter and the object from which the pulse was reflected.
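
For illustration, the round-trip timing relationship described above can be written as a short calculation; this sketch (in Python) uses hypothetical names and is not drawn from the patent itself.

```python
# Sketch of the time-of-flight range calculation described above: the measured
# round-trip time is halved because the pulse travels out and back.
C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

# Example: a pulse returning after about 667 ns corresponds to roughly 100 m.
print(range_from_tof(667e-9))  # ~100 m
```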

When multiple pulses are emitted in rapid succession, and the direction of those emissions is varied, each distance measurement can be considered a pixel, and a collection of pixels emitted and captured in rapid succession (called a “point cloud”) can be rendered as an image or analyzed for other reasons such as detecting obstacles. Viewers that render these point clouds can manipulate the view to give the appearance of a 3-D image.

In co-pending application Ser. No. 11/777,802, the applicant described a variety of systems for use in creating such point cloud images using Laser Imaging Detection and Ranging (LiDAR). In one version, the LiDAR system was used for terrain mapping and obstacle detection, and incorporated as a sensor for an autonomous vehicle. An exemplary LiDAR system included eight assemblies of eight lasers each as shown in FIG. 1, or two assemblies of 32 lasers each forming a 64-element LiDAR system as shown in FIG. 2. Yet other numbers of lasers or detectors are possible, and in general the LiDAR was employed in an assembly configured to rotate at a high rate of speed in order to capture a high number of reflected pulses in a full circle around the LiDAR sensor.

The preferred examples of the present invention described further below build on the inventor's prior work as described above, incorporating several improvements to reduce the overall size and weight of the sensor, provide better balance, reduce crosstalk and parallax, and provide other advantages.

The present invention provides a LiDAR-based 3-D point cloud measuring system. An example system includes a base, a housing, a plurality of photon transmitters and photon detectors contained within the housing, a rotary motor that rotates the housing about the base, and a communication component that allows transmission of signals generated by the photon detectors to external components.

In one version of the invention, the system provides 32 emitter/detector pairs aligned along a vertical axis within a housing that spins to provide a 360 degree field of view. The emitters may be aligned along a first axis, with the detectors aligned along a second axis adjacent to the first.

In a preferred implementation, the emitters and detectors are mounted on thin circuit boards such as ceramic hybrid boards allowing for installation on a vertical motherboard for a vertical configuration, improved alignment, and other advantages. The motherboard, in one version, is formed with a hole; the emitters fire rearward into a mirror that reflects the emitted light through the hole and through lenses adjacent the motherboard.

In certain configurations, the system employs a conjoint lens system that reduces or eliminates the parallax problem that may arise with the use of separate emitter and detector optics.

In still further examples of the invention, the emitters fire in a non-adjacent pattern, and most preferably in a pattern in which sequentially fired lasers are physically distant from one another in order to reduce the likelihood of crosstalk.

Preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings:

FIG. 1 is a front view of a rotating LiDAR system.

FIG. 2 is a perspective view of an alternate LiDAR system.

FIG. 3 is a perspective view of a preferred LiDAR system, showing an exemplary field of view of the laser emitters.

FIG. 4 is a side view of the preferred LiDAR system of FIG. 3.

FIG. 5 is a side view of the LiDAR system in accordance with FIG. 4, shown with the housing removed.

FIG. 6 is a perspective view of a hybrid containing a preferred detector.

FIG. 7 is a perspective view of a hybrid containing a preferred emitter.

FIG. 8 is a back perspective view of the LiDAR system as shown in FIG. 5.

FIG. 9 is a top perspective view of the LiDAR system as shown in FIG. 5.

FIG. 10 is an exemplary view of a LiDAR system with a potential parallax problem.

FIG. 11 is an exemplary front view of a lens assembly.

FIG. 12 is a sectional view of a lens assembly, taken along line A-A in FIG. 11.

FIG. 13 is a sectional view of an alternate lens assembly, taken along line A-A in FIG. 11.

FIG. 14 is a representative view of a conjoined D-shaped lens solving the parallax problem of FIG. 10.

FIG. 15 is a front view of the LiDAR system as shown in FIG. 5.

FIG. 16 is an exemplary view of a rotary coupler for coupling a housing to a rotating head assembly.

FIG. 17 is an illustration of a potential crosstalk problem.

FIG. 18 is an illustration of a further potential crosstalk problem.

Exemplary LiDAR systems are shown in FIGS. 1 and 2. In each case, a rotating housing fires light pulses that reflect from objects so that the return reflections may be detected by detectors within the rotating housing. By rotating the housing, the system provides a 360-degree horizontal field of view (FOV) and, depending on the number and orientation of lasers within the housing, a desired vertical field of view. The system is typically mounted on the top center of a vehicle, giving it a clear view in all directions, and rotates at a rate of about 10 Hz (600 RPM), thereby providing a high point cloud refresh rate, such a high rate being advantageous for autonomous navigation at higher speeds. In other versions, the spin rate is within a range of about 5 to 20 Hz (300-1200 RPM). In this configuration, the system can collect approximately 2.56 million time-of-flight (TOF) distance points per second. The system therefore provides the unique combination of 360-degree FOV, high point cloud density, and high refresh rate. The standard deviation of TOF distance measurements is equal to or less than 2 cm. The LiDAR system may incorporate an inertial navigation system (INS) sensor mounted on it to report x, y, z deviations and pitch, roll, and yaw of the unit, which are used by navigational computers to correct for these deviations.
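
The aggregate point rate quoted above follows from the number of lasers and an assumed per-laser firing rate; the per-laser figure of 40 kHz below is an illustrative assumption chosen to reproduce the stated total, not a number given in the text.

```python
# Back-of-the-envelope check of the stated ~2.56 million points per second for
# a 64-laser unit. The 40 kHz per-laser rate is an assumption for illustration.
lasers = 64
fires_per_laser_hz = 40_000           # assumed per-laser firing rate
points_per_second = lasers * fires_per_laser_hz
print(points_per_second)              # 2,560,000

spin_hz = 10                          # ~600 RPM, as described above
print(points_per_second // spin_hz)   # ~256,000 points per revolution
```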

Through the use of a DSP, a dynamic power feature allows the system to increase the intensity of the laser emitters if a clear terrain reflection is not obtained by the photo detectors (whether due to the reflectivity of the surface, weather, dust, distance, or other reasons), and to reduce power to the laser emitters for laser life and safety reasons if a strong reflection signal is detected by the photo detectors. A direct benefit of this feature is that the LiDAR system is capable of seeing through fog, dust, and heavy rain by increasing laser power dynamically and ignoring early reflections. The unit also has the capability to receive and decipher multiple returns from a single laser emission through digitization and analysis of the waveform generated by the detector as the emitted signal returns.
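
A minimal sketch of the dynamic power idea is shown below, assuming a simple threshold scheme; the thresholds, step size, and limits are hypothetical and are not taken from the patent.

```python
from typing import Optional

# Hedged sketch of a dynamic laser-power loop of the kind described above:
# raise emitter intensity when no clear return is obtained, and back off when
# a strong return is detected. All numeric values here are illustrative only.
def adjust_power(power: float, return_amplitude: Optional[float],
                 weak: float = 0.2, strong: float = 0.8,
                 step: float = 0.05, p_min: float = 0.1, p_max: float = 1.0) -> float:
    if return_amplitude is None or return_amplitude < weak:
        return min(p_max, power + step)   # weak or missing return: increase intensity
    if return_amplitude > strong:
        return max(p_min, power - step)   # strong return: reduce power for laser life and safety
    return power                          # otherwise leave the power setting unchanged
```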

The LiDAR systems of FIGS. 1 and 2 report data in the form of range and intensity information via Ethernet (or similar output) to a master navigational system. Using standard trigonometry, the range data is converted into x and y coordinates and a height value. The height value can be corrected for the vehicle's pitch and roll so that the resulting map is referenced to the horizontal plane of the vehicle. The map is then “moved” in concert with the vehicle's forward or turning motion. Thus, the sensor's input is cumulative and forms an ultra-high-density profile map of the surrounding environment.
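
The trigonometric conversion described above might look like the following sketch; the angle conventions and rotation order are assumptions made for illustration rather than the patent's specific implementation.

```python
import math

# Illustrative conversion of a single return (range, azimuth, elevation) into
# x/y coordinates and a height value, followed by a pitch/roll correction so
# the result is referenced to the horizontal plane of the vehicle. The axis
# and sign conventions here are assumptions for this sketch.
def return_to_xyz(r, azimuth_deg, elevation_deg, pitch_deg=0.0, roll_deg=0.0):
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = r * math.cos(el) * math.sin(az)   # lateral
    y = r * math.cos(el) * math.cos(az)   # forward
    z = r * math.sin(el)                  # height before attitude correction

    # Undo vehicle roll (about the forward axis) and pitch (about the lateral axis).
    q = math.radians(roll_deg)
    x, z = x * math.cos(q) + z * math.sin(q), -x * math.sin(q) + z * math.cos(q)
    p = math.radians(pitch_deg)
    y, z = y * math.cos(p) + z * math.sin(p), -y * math.sin(p) + z * math.cos(p)
    return x, y, z
```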

This highly detailed terrain map is then used to calculate obstacle avoidance vectors if required and to determine the maximum allowable speed given the terrain ahead. The LiDAR system identifies the size and distance of objects in view, including the vertical position and contour of a road surface. The anticipated offset of the vehicle from a straight, level path, either vertical or horizontal, at different distances is translated into the G-force that the vehicle will be subject to when following the proposed path at the current speed. That information can be used to determine the maximum speed that the vehicle should be traveling, and acceleration or braking commands are issued accordingly. In all cases the software seeks the best available road surface (and thus the best possible speed) still within the boundaries of a global positioning system (GPS) waypoint being traversed.
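
One simple way to turn an anticipated path offset into an allowable speed is sketched below; the circular-arc curvature approximation and the lateral-acceleration limit are illustrative assumptions and are not the patent's algorithm.

```python
import math

# Hedged sketch: estimate the curvature implied by a lateral offset over a
# lookahead distance, then cap speed so the resulting lateral acceleration
# (G-force) stays under a chosen limit. All values are illustrative.
def max_speed(lateral_offset_m: float, lookahead_m: float,
              a_lat_max: float = 0.3 * 9.81) -> float:
    if lateral_offset_m == 0.0:
        return float("inf")                                      # straight path
    curvature = 2.0 * abs(lateral_offset_m) / lookahead_m ** 2   # circular-arc approximation
    return math.sqrt(a_lat_max / curvature)                      # v with v^2 * curvature <= a_lat_max

# Example: a 1 m offset over a 30 m lookahead limits speed to about 36 m/s here.
print(max_speed(1.0, 30.0))
```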

One version of the inventor's prior system as illustrated in FIG. 1 includes 64 emitter/detector (i.e. laser diode/photo diode) pairs divided into eight groups of eight. The system shown in FIG. 2 also includes 64 emitter/detector pairs, but in a configuration of 2 assemblies of 32 pairs. It is also possible to “share” a single detector among several lasers by focusing several detection regions onto a single detector, or by using a single, large detector. If a single laser is fired at a time, there is no ambiguity as to which laser is responsible for a return signal. Conversely, one could also sub-divide a single laser beam into several smaller beams. Each beam would be focused onto its own detector. In any event, such systems are still considered to employ emitter-detector pairs.

In the versions as illustrated in FIGS. 1 and 2, the laser diode is preferably an OSRAM 905 nm emitter, and the photo diode is preferably an Avalanche variety. More particularly, in the preferred version each one of the detectors is an avalanche photodiode detector. The lenses are preferably UV treated to block sunlight, or employ a separate UV lens filter in the optical path. Each pair is preferably physically aligned in ⅓° increments, ranging from approximately 2° above horizontal to approximately 24° below horizontal. Each of the emitter/detector pairs is controlled by one or more DSPs (or, in some versions, field programmable gate arrays (FPGAs) or other microprocessors), which determine when the pairs will fire, determine the intensity of the firing based on the previous return, record the time-of-flight, and calculate height data based on the time-of-flight and angular alignment of each pair. Results, including multiple returns if any, are transmitted via Ethernet to the master navigational computer via a rotary coupling.

It is also advantageous to fire only a few lasers, or preferably just one, at a time. This is because of naturally occurring crosstalk, or system blinding, that occurs when the laser beam encounters a retroreflector. Such retroreflectors are commonly used along roadways. A single-beam-at-a-time system is thus resistant to retroreflector blinding, while a flash system could suffer severe image degradation as a result.

In addition to reducing crosstalk, firing only a single laser at a time while rotating at a high rate facilitates eye safety. The high-powered lasers used with the present preferred versions of the invention would require protective eyewear if the system were used in a stationary fashion. Rotating the system and firing fewer lasers at once for brief pulses allows high-powered lasers to be used while still meeting eye safety requirements that do not require protective eyewear. In accordance with this aspect of the invention, the system employs a control component that does not allow the emitters to fire until the head has reached a desired minimal rotation speed.

Another advantage of firing only a small number of lasers at a time is the ability to share, or multiplex, the detection circuitry among several detectors. Since the detection circuitry consists of high speed Analog to Digital Converters (ADCs), such as those made by National Semiconductor, considerable cost savings can be had by minimizing the use of these expensive components.

In the preferred embodiment, the detectors are power cycled, such that only the desired detector is powered up at any one time. The signals can then simply be multiplexed together. An additional benefit of power-cycling the detectors is that total system power consumption is reduced, so the detectors run cooler and are therefore more sensitive.
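
A minimal sketch of the power-cycling and multiplexing idea follows; the hardware-facing callables are hypothetical placeholders rather than a real driver interface.

```python
# Hedged sketch of the detector power-cycling scheme described above: only the
# detector paired with the currently firing emitter is powered, and every
# detector shares the same high-speed ADC path. The callables passed in are
# hypothetical placeholders for the real hardware interface.
def measure_channel(channel, power_up, power_down, fire_emitter, read_adc):
    power_up(channel)        # enable only this channel's detector
    fire_emitter(channel)    # fire the paired emitter
    waveform = read_adc()    # shared ADC digitizes the one live detector
    power_down(channel)      # detector stays cool (and therefore more sensitive)
    return waveform
```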

A simple DC motor controller driving a high-reliability brushed or brushless motor controls the rotation of the emitter/detectors. A rotary encoder feeds rotational position to the DSPs (or other microprocessors), which use the position data to determine the firing sequence. Software and physical fail-safes ensure that no firing takes place until the system is rotating at a minimum RPM.
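
The rotation interlock and encoder-driven scheduling might be sketched as follows; the 300 RPM threshold and the function names are illustrative assumptions.

```python
# Hedged sketch of the fail-safe and encoder-driven firing described above:
# nothing fires until the head exceeds a minimum spin rate, after which the
# encoder angle determines which azimuth step to fire. The threshold and the
# fire_at_azimuth callable are illustrative assumptions.
MIN_RPM = 300  # assumed minimum spin rate before firing is permitted

def maybe_fire(current_rpm: float, encoder_angle_deg: float, fire_at_azimuth) -> bool:
    if current_rpm < MIN_RPM:
        return False                     # fail-safe: head not yet up to speed
    fire_at_azimuth(encoder_angle_deg)   # firing sequence keyed to rotary position
    return True
```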

FIG. 2 illustrates a perspective view of a 64 emitter/detector pair LiDAR component 150. The component 150 includes a housing 152 that is open on one side for receiving a first LiDAR system 154 located above a second LiDAR system 156. The second LiDAR system 156 is positioned to have a line of sight at a greater angle relative to horizontal than the first LiDAR system 154. The housing 152 is mounted over a base housing section 158.

The LiDAR system of FIG. 2 includes a magnetic rotor and stator. A rotary coupling, such as a three-conductor Mercotac model 305, passes through the center of the base 158 and the rotor. The three conductors facilitated by the rotary coupling are power, signal, and ground. A bearing mounts on the rotary coupling. A rotary encoder has one part mounted on the rotary coupling and another part mounted on the base section 158 of the housing 152. The rotary encoder, such as a U.S. Digital model number E65-1000-750-I-PKG1, provides information regarding the rotary position of the housing 152. The magnetic rotor and stator cause rotary motion of the base section 158 and thus the housing 152 about the rotary coupling.

The version described below with reference to FIGS. 3-16 is generally referred to as a High Definition LiDAR 32E (HDL-32E) and operates on the same foundational principles as the sensors of FIGS. 1 and 2 in that a plurality (in this embodiment up to 32) of laser emitter/detector pairs are aligned along a vertical axis with the entire head spinning to provide a 360-degree horizontal field of view (FOV). Each laser issues light pulses (in this version, 5 ns pulses) that are analyzed for time-of-flight distance information (called a “distance pixel” or “return”). Like the system of FIG. 2, the system reports returns in Ethernet packets, providing both distance and intensity (i.e. the relative amount of emitted light received back) information for each return. The sample system reports approximately 700,000 points per second. While all or any subset of the features described above with respect to FIGS. 1 and 2 may be incorporated into the version described below with respect to FIGS. 3-16, alternate embodiments of the invention may optionally include the additional aspects as described in detail below.
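
The sketch below shows one hypothetical way a distance-and-intensity return could be packed into a record; it is not the actual HDL-32E Ethernet packet format, only an illustration of the kind of per-return data described above.

```python
import struct

# Hypothetical per-return record, for illustration only -- NOT the actual
# HDL-32E packet layout. It simply captures the data described above: which
# laser fired, the measured distance, and the return intensity.
def pack_return(laser_id: int, distance_m: float, intensity: int) -> bytes:
    # laser id (1 byte), distance in 2 mm ticks (2 bytes), intensity 0-255 (1 byte)
    return struct.pack("<BHB", laser_id, int(distance_m / 0.002), intensity)

def unpack_return(record: bytes):
    laser_id, ticks, intensity = struct.unpack("<BHB", record)
    return laser_id, ticks * 0.002, intensity

print(unpack_return(pack_return(7, 42.5, 180)))  # (7, 42.5, 180)
```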

In a preferred version as illustrated in FIG. 3, the cylindrical sensor head 10 is about 3.5 inches in diameter and the unit has an overall height of 5.6 inches and weighs about 2.4 pounds. By contrast, the HDL-64E (shown in FIG. 2) is 8 inches in diameter by approximately one foot tall, and weighs about 29 pounds. This reduction in size is the result of several inventive improvements, as described more fully below.

The sample embodiment of FIG. 3 can be built with a variable number of lasers, aligned over a vertical FOV 12 of +10 to −30 degrees as best seen in FIG. 4. The vertical FOV may be made larger or smaller, as desired, by adjusting the number or orientation of the emitters and detectors. When the emitters are used and oriented as described, the range is approximately 100 meters. The head 10 is mounted on a fixed platform 14 having a motor configured such that the head preferably spins at a rate of 5 Hz to 20 Hz (300-1200 RPM). The sample system uses 905 nm laser diodes (although other frequencies such as 1550 nm could be used) and is Class 1 eye safe.

FIG. 5 illustrates the same version as shown in FIGS. 3 and 4, though without the outer housing covering the internal components. In general, and as discussed more fully below, the system includes a main motherboard 20 supporting a plurality of detector hybrids 32 and emitter hybrids (not visible in FIG. 5). The emitters fire back toward the rear of the system, where the pulses are reflected from a mirror and then are directed through a lens 50. Return pulses pass through a lens, are reflected by a mirror 40, then directed to the detectors incorporated into the hybrids 32. The motherboard 20 and mirror 40 are mounted to a common frame 22 providing common support and facilitating alignment.

The hybrids 32 are mounted to the motherboard in a fan pattern that is organized about a central axis. In the version as shown, 32 hybrids are used in a pattern to create a field of view extending 10 degrees above and 30 degrees below the horizon and therefore the central axis extends above and below the ninth board 38, with 8 boards above and 23 boards below the central axis. In one version, each successive board is inclined an additional one and one-third degree with respect to the next adjacent board. The desired incremental and overall inclination may be varied depending on the number of hybrids used, the geometry of the mirrors and lenses, and the desired range of the system.
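
For illustration, the sketch below generates the approximate per-board elevation angles implied by the fan described above (32 boards stepped by about one and one-third degree, starting near +10 degrees); the exact angles of a real unit would be set by the optics and frame, so these values are only indicative.

```python
# Approximate elevation angles for the 32-board fan described above, stepping
# about one and one-third degree per board downward from roughly +10 degrees.
# These values are indicative only; the actual angles depend on the optics.
NUM_BOARDS = 32
STEP_DEG = 4.0 / 3.0       # ~1.33 degrees between adjacent boards
TOP_DEG = 10.0

angles = [TOP_DEG - i * STEP_DEG for i in range(NUM_BOARDS)]
print(round(angles[0], 2), round(angles[8], 2), round(angles[-1], 2))
# 10.0  -0.67  -31.33  (the ninth board sits near the horizontal central axis)
```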

One of the features allowing for compact size and improved performance of the version of FIG. 3 is the use of thin circuit boards such as ceramic hybrid boards for each of the emitters and detectors. An exemplary detector circuit board 32 is shown in FIG. 6; an exemplary emitter circuit board 30 is shown in FIG. 7. In the preferred example, the thin circuit boards are in the form of ceramic hybrid boards that are about 0.015 inches thick, with only one emitter mounted on each emitter board, and only one detector mounted on each detector board. In other versions the thin circuit boards may be formed from other materials or structures instead of being configured as ceramic hybrids.

One of the advantages of mounting emitters and detectors on individual hybrid boards is the ability to then secure the individual hybrid boards to the motherboard in a vertically aligned configuration. In the illustrated version, the detectors are positioned in a first vertical alignment along a first vertical axis while the emitters are positioned in a second vertical alignment along a second vertical axis, with the first and second vertical axes being parallel and next to one another. Thus, as best seen in FIGS. 5 and 8, the hybrid boards carrying the emitters and detectors are mounted in vertical stacks that allow the sensor head to have a smaller diameter than a differently configured sensor having emitters and detectors positioned about the circumference of the system. Accordingly, the configuration reduces the overall size and requires less energy for spinning by moving more of the weight toward the center of the sensor.

As further shown in FIG. 8, the preferred version incorporates a plurality of detectors (in this case, 32 of them) mounted to an equal number of detector hybrids 32. The system likewise has the same number of emitters mounted to an equal number of emitter hybrids 30. In the preferred version, the system therefore has one emitter per hybrid and one detector per hybrid. In other versions this may be varied, for example to incorporate multiple emitters or detectors on a single hybrid. The emitter and detector hybrids are connected to a common motherboard 20, which is supported by a frame 22. The motherboard has a central opening 24 that is positioned to allow emitted and received pulses to pass through the motherboard. Because the lenses are positioned over the middle of the motherboard, the central opening is configured to be adjacent the lenses to allow light to pass through the portion of the motherboard that is next to the lenses.

The density of emitter/detector pairs populated along the vertical FOV is intentionally variable. While 32 pairs of emitters and detectors are shown in the illustrated versions, the use of hybrids and a motherboard allows for a reduction in the number of emitters and detectors by simply removing or not installing any desired number of emitter/detector pairs. This variation of the invention cuts down on the number of vertical lines the sensor produces, and thus reduces cost. It is feasible that just a few emitter/detector pairs will accomplish the goals of certain autonomous vehicles or mapping applications. For some uses increased density is desirable to facilitate seeing objects at further distances and with more vertical resolution. Other uses exploit the fact that there is a direct relationship between the number of emitter/detector pairs and sensor cost, and do not need the full spread of vertical lasers to accomplish their sensor goals.

Alternatively, multiple emitters and detectors can be designed and mounted onto the hybrid boards at slightly different vertical angles, thus increasing the density of vertical FOV coverage in the same footprint. If, for example, two emitters and two detectors were mounted on each of the hybrids shown in FIGS. 6 and 7 with slight vertical offsets, the design would incorporate 64 emitters and detectors rather than 32. This example design describes two emitters and detectors mounted per board, but there is no practical limit to the number of emitters and detectors that may be mounted on a single board. The increased number of emitters and detectors may be used to increase the field of view by adjusting the relative orientation, or may be used to increase the density of points obtained within the same field of view.

Another design feature of the preferred version is the vertical motherboard on which the main electronics that control the firing of the lasers and the capturing of returns are located. As noted above, the motherboard is mounted vertically, defining a plane that is preferably parallel to the central axis 13 (see FIG. 3) about which the system will rotate. While the motherboard is preferably parallel to this axis of rotation, it may be inclined toward a horizontal plane by as much as 30 degrees and still be considered substantially vertical in orientation. The emitter and detector hybrid boards are aligned and soldered directly to this vertical motherboard, thus providing for a small overall head size and increased reliability due to the omission of connectors that connect the laser boards with the motherboard. This board is mechanically self-supported, mounted to a frame 22 that fixes it rigidly in position in a vertical orientation so that it spins with the rotating sensor head. The insertion of the hybrid boards can be automated for easy assembly. Prior art sensors exclusively employ motherboard designs requiring connectors and cables between the emitters and detectors and the motherboard. The positioning and configuration of the motherboard as shown overcome these problems.

Another feature of the vertical motherboard design is its placement inside the sensor head. In order to optimize space, the motherboard is positioned between the mirrors and the lenses, as best seen in FIG. 9. Thus, as shown, the sensor head includes one or more lenses 50, 52 supported within a lens frame 54 positioned at a front side of the sensor head. One or more mirrors 40, 42 are positioned at the opposite side of the sensor head and mounted to the frame 22. In the illustrated version, separate mirrors 40, 42 are used for the emitter and detectors, respectively. Most preferably, the frame 22 is a unitary frame formed from a single piece of material that supports the motherboard and the mirrors.

This configuration allows the hybrid emitters to fire rearward into the first mirror 40; the light then reflects off the mirror, travels through the hole 24 in the motherboard 20 and through the lens 50, and the emitted light 60 travels out to the target 70. This configuration further increases the net focal length of the light path while retaining a small size. Likewise, the returning light 62 passes through the detector lens 52 and through the hole 24 in the motherboard to the opposite mirror 42, and is reflected into the corresponding detector.

Another benefit of the vertical motherboard design is that it facilitates the goal of balancing the sensor head both statically and dynamically to avoid shimmy and vibration during operation. Most preferably, the various components are positioned to allow a near-balanced condition upon initial assembly that requires a minimum of final static and dynamic balancing counterweights. As best seen in FIG. 9, this balancing is obtained by positioning major portions of components about the circumference of the sensor head. More specifically, the lenses and lens frame are on one side while the mirrors and a generally T-shaped portion of the frame are diametrically opposite the lenses, with the mirrors and rearward portion of the frame configured to have a weight that is about equal to that of the lenses and lens frame. Likewise, the emitter and detector hybrids are carried on diametrically opposite sides of the sensor head, positioned at about a 90-degree offset with respect to the lens and mirror diameter. The motherboard is nearly along a diameter, positioned to counterbalance the weight of the other components, such that the center of gravity is at the center of rotation defined by the center of the base 80.

When the present invention is incorporated into an autonomous navigation or mobile mapping vehicle, GPS and inertial sensors are often included to locate the vehicle in space and correct for normal vehicle motion. Inertial sensors often include gyros, such as fiber optic gyros (FOG), and accelerometers. In one embodiment, there is a 6-axis inertial sensor system mounted in the LiDAR base and the signals from the gyros and accelerometers are output along with the LiDAR distance and intensity data.

The separate locations of the emitters' and detectors' optical paths can create a parallax problem. When the emitters and detectors are separated by a finite distance, there always exists a “blind” region nearest to the sensor in which objects cannot be illuminated or detected. Likewise, at long range the emitter's laser light becomes misaligned with its corresponding detector and creates a similar blind spot. The parallax problem is best seen with reference to FIG. 10. A representative emitter 170 transmits a light signal through a lens 172, with the propagated light signal traveling outward and toward a target in the distance. Light reflected from a target may return through a second lens 162 and onward toward a detector 160. The nonparallel orientation of the emitter and detector, however, creates nonparallel emitter and detector light paths. Consequently, there is a near blind spot 180 adjacent the system and a far blind spot 184 more distant from the system. In either of the two blind spots, light reflecting from an object will return along a path that cannot be received by the detector. The near blind spot extends for a distance “A” in front of the system, while the far blind spot extends in the region of distance “C” beyond the system. Between the two blind spots, in the region defined by distance “B”, the system can see an object, in that light reflected from the object can return along a path that can be detected. Even within region B, however, there is a “sweet spot” 182 defined by the straight-line paths of travel from the emitter and to the detector. For the sample embodiment shown in FIGS. 1 and 2, the “sweet spot” 182 for parallax alignment is approximately 100 feet from the centerline of the sensor. Inside of about 10 feet the emitter's light misses its corresponding detector entirely, shown at 180, and beyond approximately 240 feet, shown at 184, the signal becomes weak due to the misalignment of the emitter and detector in the opposite direction.
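
The blind regions of FIG. 10 can be approximated with a deliberately simplified, symmetric model, sketched below; the effective capture width and the symmetry assumption are simplifications for illustration, and the actual near and far limits quoted above are not symmetric about the sweet spot.

```python
# Simplified, symmetric model of the parallax geometry discussed above. The
# emitter and detector are separated by d and both aimed at a sweet spot at
# range S; at range R the emitter's spot is laterally offset from the
# detector's line of sight by d * |1 - R/S|. A return is assumed detectable
# while that offset stays within an effective capture width w. All numbers and
# the symmetry assumption are illustrative only.
def visible_band(d: float, sweet_spot: float, capture_width: float):
    ratio = capture_width / d
    near = max(0.0, sweet_spot * (1.0 - ratio))   # start of region "B"
    far = sweet_spot * (1.0 + ratio)              # end of region "B"
    return near, far

# Example in arbitrary units: separation 1, sweet spot at 100, capture width 0.9
print(visible_band(1.0, 100.0, 0.9))   # approximately (10.0, 190.0)
```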

This effect can be alleviated in one version of the invention by having two “D”-shaped lenses 50, 52 (see FIG. 15), constructed for the emitter and detector, and having these two lenses attached to each other with a minimal gap in between. The close proximity of the conjoint lens system, best seen in FIG. 14, reduces the “blind” region to near zero, as shown by the parallel nature of the emitter's light 60 and detector's light path 62.

Due to the complex nature of the optical propagation in lenses, a lens array is usually needed to correct for various aberrations that are commonly associated with any optical design. For the purpose of constructing a conjoint lens system to overcome the parallax problem described with respect to FIG. 10, it is useful to have the first surface of the lens array be the largest pupil; that is, the optical rays entering the lens system should bend towards the center.

FIG. 11 illustrates a front view of a lens array 50. Though indicated as the emitter lens array, it is equally illustrative of the detector lens array. In order to form a D-shaped lens, an edge 51 of the otherwise circular lens is cut away, removing a left edge 120 of the lens. The resulting lens is somewhat D-shaped, having a vertical left edge. The use of a D-shaped lens array is advantageous in that D-shaped lens arrays for the emitter and detector may be placed back-to-back to form “conjoined” D-shaped lens arrays as best seen in FIG. 15. Placing the vertical edges of the D-shapes adjacent one another allows the otherwise circular lenses to be much closer to one another than would be the case with circular lenses, which would only allow for tangential contact between the lens arrays.

The creation of D-shaped lenses and the use of a conjoined pair of D-shaped lens arrays, however, bring a potential signal loss. FIG. 12 illustrates a correct design of the lens array, shown in sectional view taken along line A-A of FIG. 11. In this illustration the lens array includes a first lens 113, a second lens 111, and a third lens 112. The input rays 100 always bend towards the center in this lens array. Consequently, when a D-shaped cut is made (that is, cutting off a portion of one side of each of the lenses in the area indicated by the shaded region 120), there is no loss of light. As the shaded region indicates, all of the light entering the first lens 113 travels through the entire lens array to the mirror.

FIG. 13 illustrates an incorrect design having a similar array of three lenses 110, 111, 112. In this case, the front lens 110 is differently shaped and some of the input light rays 100 bend away from the center as they travel through the front lens. A cut through the ends of one side of this lens array would result in the loss of some of the light entering the array, as indicated by the shaded region 120 in FIG. 13.

By configuring the lenses in an ideal fashion as illustrated in FIG. 12, a portion of each side of the lens array may be cut in the form of a D-shape. This creates a straight edge along the sides of each lens in the array, allowing the straight sides of the D's forming each lens array to be positioned closely adjacent one another. In this sense, the term “closely adjacent” is understood to mean either in contact with one another or positioned such that the centers of the lenses are closer to one another than they could be without the D-shaped cut. As best seen in FIG. 15, the two lens arrays 50, 52 are positioned closely adjacent one another with the straight sides back-to-back to form conjoined D-shaped lens arrays. As described above, a first lens array 50 serves as the emitter lens array while the adjacent second lens array 52 serves as the detector lens array.

FIG. 14 illustrates an advantage of the conjoint D-shaped lens design, particularly in how it overcomes the parallax problem illustrated in FIG. 10. In this case, light emerging from the emitter 170 is directed to a first D-shaped lens 50. Most preferably, the emitter is oriented to direct its light path toward a position just inward of the straight side edge of the D-shape. Because of the lens array configuration of the type described in FIG. 12, the light emerges from the first lens 50 in a straight line 60 that can be directed radially away from the sensor head. Likewise, light reflected from the distant object will return along a return path 62 that is parallel to the emitter light path. The closely parallel return path will travel through the second, adjacent conjoined D lens array 52, entering the lens array at a position just inward of the straight side edge of the D-shape, where it is then directed to the detector 160. Consequently, there is no blind spot as with conventional lenses and the parallax problem is resolved.

Another unique design consideration for the preferred implementation addresses the need to transfer power and signal up to the head, and to receive signal and provide grounding down from the head. Off-the-shelf mercury-based rotary couplers are too unreliable and too big for this purpose. In one embodiment, shown in FIG. 16, a rotary transformer 145 is used to send power up to the head, and a capacitive coupler 140 is used to carry signal and grounding down from the head, accommodating these requirements. A phase modulation scheme allows for communication to the head from the base using serial commands in order to instruct the head to limit its horizontal field of view, fire all lasers at full power, update its firmware, and perform other commands.
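
A hypothetical encoding of the kinds of serial commands mentioned above is sketched below; the opcodes, framing, and checksum are inventions for illustration and do not reflect the actual protocol carried over the coupler.

```python
# Hypothetical framing for the kinds of serial commands described above
# (limit the horizontal field of view, fire all lasers at full power, update
# firmware). Opcodes, layout, and checksum are illustrative only and are not
# the actual protocol used over the rotary transformer / capacitive coupler.
CMD_LIMIT_FOV = 0x01        # payload: start and end azimuth, hundredths of a degree
CMD_FULL_POWER = 0x02       # payload: none
CMD_UPDATE_FIRMWARE = 0x03  # payload: a firmware image chunk

def frame_command(opcode: int, payload: bytes = b"") -> bytes:
    checksum = (opcode + sum(payload)) & 0xFF
    return bytes([0xAA, opcode, len(payload)]) + payload + bytes([checksum])

# Example: restrict the horizontal field of view to 0-180 degrees.
payload = (0).to_bytes(2, "little") + (18000).to_bytes(2, "little")
print(frame_command(CMD_LIMIT_FOV, payload).hex())
```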

It is also desired to have the distance returns of the LiDAR scanner be as accurate as possible and be free of spurious images or returns. Firing multiple lasers at once can create a crosstalk condition in which the light emitted from one laser is inadvertently detected by the detector of another laser, thus giving a false return. Thus, with reference to FIG. 17, if emitters E1 through E4 all fire at once, their returns are intended to be received by detectors D1 through D4. But depending on the positioning and configuration of the object from which the light returns, light from one of the emitters may be directed to the wrong detector. For example, as indicated in FIG. 17, light from emitter E1 may end up directed to detector D3, as indicated by the dotted-line return path. This would be an invalid return, and the system would erroneously associate it with light sent from emitter E3, thereby creating a faulty pixel in the point cloud.

A similar error can occur if adjacent lasers are fired in a sequential fashion. Thus, with reference to FIG. 18, firing a single emitter E1 may result in light being detected at detector D2 rather than D1. This may most commonly occur when light from emitter E1 travels beyond the true range of the sensor but is reflected from a particularly reflective object, such as a stop sign covered with reflective paint. Firing adjacent emitters in order makes this form of crosstalk more likely.

In accordance with a preferred version of the invention, the emitters are fired in a non-adjacent, single-laser firing order. This means that only one emitter/detector pair is active at any given time, and at no time do adjacent emitters and detectors fire in sequence. Most preferably there is as much distance as possible between the emitters that are fired in order. Thus, if there are 32 emitters in a vertical stack, the emitters would be assigned labels E1, representing the top-most emitter, sequentially through E32, representing the bottom emitter in the stack. Emitter E1 (at the top) would be fired first, followed by emitter E17 (in the middle of the stack), then E2, E18, E3, E19, and so on, ending with E16 and E32 before starting over again at the beginning. This pattern begins with the top emitter and the middle emitter, dividing the stack into two groups. It then alternates firing one from each group, moving from the top of each half-stack and proceeding sequentially down each half-stack of emitters in this alternating fashion, and then repeating. This pattern ensures the largest possible distance between sequentially fired lasers, thereby reducing the chance of crosstalk.
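
The non-adjacent firing order described above can be generated with the short sketch below (shown for the 32-emitter case); only the helper name is an invention.

```python
# Generates the non-adjacent, single-laser firing order described above: the
# vertical stack of n emitters is split into top and bottom halves, and firing
# alternates between the halves while proceeding sequentially down each one
# (E1, E17, E2, E18, ..., E16, E32 for n = 32).
def firing_order(n: int = 32):
    half = n // 2
    order = []
    for i in range(1, half + 1):
        order.append(i)         # next emitter from the top half
        order.append(i + half)  # corresponding emitter from the bottom half
    return order

print(firing_order(32))  # [1, 17, 2, 18, ..., 16, 32]
```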

While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.

Hall, David S.

Patent Priority Assignee Title
11879999, Jun 13 2018 HESAI TECHNOLOGY CO., LTD. Lidar systems and methods
11933967, Aug 22 2019 Red Creamery, LLC Distally actuated scanning mirror
Patent Priority Assignee Title
10003168, Oct 18 2017 LUMINAR TECHNOLOGIES, INC Fiber laser with free-space components
10018726, Mar 19 2016 VELODYNE LIDAR USA, INC Integrated illumination and detection for LIDAR based 3-D imaging
10048374, Mar 21 2016 VELODYNE LIDAR USA, INC LIDAR based 3-D imaging with varying pulse repetition
10094925, Mar 31 2017 LUMINAR TECHNOLOGIES, INC Multispectral lidar system
10109183, Dec 30 2016 ZOOX, INC Interface for transferring data between a non-rotating body and a rotating body
10120079, Mar 25 2015 Waymo LLC Vehicle with multiple light detection and ranging devices (LIDARS)
10126412, Aug 19 2013 QUANERGY SOLUTIONS, INC Optical phased array lidar system and method of using same
10132928, May 09 2013 QUANERGY SOLUTIONS, INC Solid state optical phased array lidar and method of using same
10309213, Dec 28 2015 Halliburton Energy Services, Inc Distributed optical sensing using compressive sampling
10330780, Mar 20 2017 VELODYNE LIDAR USA, INC LIDAR based 3-D imaging with structured light and integrated illumination and detection
10386465, Mar 31 2017 VELODYNE LIDAR USA, INC Integrated LIDAR illumination power control
10393874, Jul 02 2014 Robert Bosch GmbH Distance measuring device
10393877, Jun 01 2016 VELODYNE LIDAR USA, INC Multiple pixel scanning LIDAR
10436904, Apr 15 2015 The Boeing Company Systems and methods for modular LADAR scanning
10545222, May 08 2017 VELODYNE LIDAR USA, INC LIDAR data acquisition and control
10613203, Jul 01 2019 VELODYNE LIDAR USA, INC Interference mitigation for light detection and ranging
10627490, Jan 31 2016 VELODYNE LIDAR USA, INC Multiple pulse, LIDAR based 3-D imaging
10627491, Mar 31 2017 VELODYNE LIDAR USA, INC Integrated LIDAR illumination power control
10712434, Sep 18 2018 VELODYNE LIDAR USA, INC Multi-channel LIDAR illumination driver
3064252,
3373441,
3551845,
3636250,
3686514,
3781111,
3862415,
3897150,
3921081,
4179216, May 31 1977 Franz Plasser Bahnbaumaschinen-Industriegesellschaft m.b.H. Apparatus for measuring the profile of a railroad tunnel
4199697, Jul 05 1978 Northern Telecom Limited Pulse amplitude modulation sampling gate including filtering
4201442, Oct 02 1978 Sperry Corporation Liquid crystal switching coupler matrix
4212534, Sep 30 1977 Siemens Aktiengesellschaft Device for contact-free measuring of the distance of a surface of an object from a reference plane
4220103, Aug 10 1978 Aisin Seiki Kabushiki Kaisha Auxiliary table for sewing machines of a free arm type
4477184, Jan 19 1979 Nissan Motor Company, Limited Obstacle detection system for use in vehicles
4516837, Feb 22 1983 Sperry Corporation Electro-optical switch for unpolarized optical signals
4634272, Jun 02 1982 Nissan Motor Company, Limited Optical radar system with an array of photoelectric sensors
4656462, Apr 25 1984 Matsushita Electric Works, Ltd. Object detecting apparatus including photosensors for restricted detection area
4681433, Jul 20 1978 Kern & Co. AG. Method and apparatus for measuring relative position
4700301, Mar 12 1981 Method of automatically steering agricultural type vehicles
4730932, Jan 31 1986 Kabushiki Kaisha Toshiba; Toyo Glass Co., Ltd. Transmissivity inspection apparatus
4742337, Aug 28 1985 Telenot Electronic GmbH Light-curtain area security system
4834531, Oct 31 1985 Energy Optics, Incorporated Dead reckoning optoelectronic intelligent docking system
4862257, Jul 07 1988 Kaman Aerospace Corporation Imaging lidar system
4895440, Aug 22 1988 Trimble Navigation Limited Laser-based measurement system
4896343, May 02 1988 Radiation apparatus with distance mapper for dose control
4902126, Feb 09 1988 FIBERTEK, INC Wire obstacle avoidance system for helicopters
4944036, Dec 28 1970 Signature filter system
4952911, May 18 1988 Eastman Kodak Company Scanning intrusion detection device
4967183, May 18 1988 Eastman Kodak Company Method of intrusion detection over a wide area
5004916, Jul 28 1989 Intel Corporation Scanning system having automatic laser shutdown upon detection of defective scanning element motion
5006721, Mar 23 1990 PERCEPTRON, INC Lidar scanning system
5023888, Jul 24 1972 Lockheed Martin Corporation Pulse code recognition method and system
5026156, Jul 24 1972 Lockheed Martin Corporation Method and system for pulse interval modulation
5033819, Feb 10 1989 Asahi Kogaku Kogyo Kabushiki Kaisha Light intercepting device in lens barrel
5059008, Mar 26 1990 Lockheed Martin Corporation Wide angle beam steerer using translation of plural lens arrays
5175694, Feb 08 1990 The United States of America as represented by the Secretary of the Navy Centroid target tracking system utilizing parallel processing of digital data patterns
5177768, Nov 22 1991 Telcordia Technologies, Inc Spread-time code division multiple access technique with arbitrary spectral shaping
5210586, Jun 27 1990 Siemens Aktiengesellschaft Arrangement for recognizing obstacles for pilots of low-flying aircraft
5212533, Nov 14 1990 Kabushiki Kaisha Topcon Light wave distance meter
5241481, Jun 22 1987 Arnex Handelsbolag Method and a device for laser optical navigation
5249157, Aug 22 1990 KOLLMORGEN CORPORATION Collision avoidance system
5291261, Feb 06 1990 MOTOROLA, INC , A CORP OF DELAWARE Optical object detection system incorporating fiber optic coupling
5309212, Sep 04 1992 Yaskawa Electric Corporation Scanning rangefinder with range to frequency conversion
5314037, Jan 22 1993 Automobile collision avoidance system
5319201, Feb 12 1991 PROXIMETER COMPANY LIMITED, THE Proximity detector
5357331, Jul 02 1991 Lockheed Martin Corp System for processing reflected energy signals
5365218, Sep 14 1991 Deutsche Aerospace AG System for guarding property including a mobile laser unit
5463384, Feb 11 1991 AUTOSENSE LLC Collision avoidance system for vehicles
5465142, Apr 30 1993 Northrop Grumman Systems Corporation Obstacle avoidance system for helicopters and other aircraft
5515156, Jul 29 1993 OMRON AUTOMOTIVE ELECTRONICS CO , LTD Electromagentic wave generating device and a distance measuring device
5546188, Nov 23 1992 WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT Intelligent vehicle highway system sensor and method
5563706, Aug 24 1993 Nikon Corporation Interferometric surface profiler with an alignment optical member
5572219, Jul 07 1995 General Electric Company Method and apparatus for remotely calibrating a phased array system used for satellite communication
5691687, Jul 03 1995 ADMINSTRATOR OF THE NATIONAL AERONAUTICS AND SPACE ADMINISTRATION Contactless magnetic slip ring
5710417, Oct 21 1988 Symbol Technologies, LLC Bar code reader for reading both one dimensional and two dimensional symbologies with programmable resolution
5757472, Nov 23 1992 WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT Intelligent vehicle highway system sensor and method
5757501, Aug 17 1995 Apparatus for optically sensing obstacles in front of vehicles
5757677, Sep 08 1994 Trimble Navigation Limited Compensation for differences in receiver signals and in satellite signals
5789739, Oct 26 1995 Sick AG Optical detection device for determining the position of an indicator medium
5793163, Sep 29 1995 Pioneer Electronic Corporation Driving circuit for light emitting element
5793491, Dec 30 1992 WELLS FARGO BANK, NATIONAL ASSOCIATION, AS ADMINISTRATIVE AGENT Intelligent vehicle highway system multi-lane sensor and method
5805468, May 09 1995 Sick AG Method and apparatus for determining the light transit time over a measurement path arranged between a measuring apparatus and a reflecting object
5847817, Jan 14 1997 McDonnell Douglas Corporation Method for extending range and sensitivity of a fiber optic micro-doppler ladar system and apparatus therefor
5877688, Apr 12 1995 MATSUSHITA ELECTRIC INDUSTRIAL CO , LTD Thermal object measuring apparatus
5889479, Mar 02 1994 VIA TECHNOLOGIES,INC ; IP-First, LLC Apparatus for guiding the pilot of an aircraft approaching its parking position
5895984, Dec 13 1995 Leica Geosystems AG Circuit arrangement for feeding a pulse output stage
5903355, Mar 31 1994 Sick AG Method and apparatus for checking a predetermined monitoring area
5903386, Jan 20 1998 Northrop Grumman Systems Corporation Tilted primary clamshell lens laser scanner
5923910, Feb 22 1995 Asahi Kogaku Kogyo Kabushiki Kaisha Distance measuring apparatus
5942688, Nov 18 1994 Mitsubishi Denki Kabushiki Kaisha Apparatus and method for detecting a measurable quantity of an object
5949530, Feb 27 1996 Sick AG Laser range finding apparatus
5953110, Apr 23 1998 H.N. Burns Engineering Corporation Multichannel laser radar
5991011, Nov 14 1996 Sick AG Laser distance finding apparatus
6034803, Apr 30 1997 QUANTAPOINT, INC Method and apparatus for directing energy based range detection sensor
6043868, Aug 23 1996 KAMA-TECH HK LIMITED Distance measurement and ranging instrument having a light emitting diode-based transmitter
6069565, Oct 20 1992 Rosemount Aerospace Inc System for detecting ice or snow on surface which specularly reflects light
6088085, Feb 05 1997 Sick AG Range measurement apparatus
6091071, Apr 18 1996 Sick AG Opto-electronic sensor
6100539, Jan 20 1997 Sick AG Light sensor with evaluation of the light transit time
6137566, Feb 24 1999 EOO, INC Method and apparatus for signal processing in a laser radar receiver
6153878, Aug 17 1998 Sick AG Device for locating objects penetrating into a region of space to be monitored
6157294, Dec 27 1997 Honda Giken Kogyo Kabushiki Kaisha Vehicle obstacle detecting system
6201236, Nov 13 1997 AUTOSENSE LLC Detection system with improved noise tolerance
6259714, Sep 09 1997 Mitsubishi Denki Kabushiki Kaisha Power source control apparatus for laser diode
6297844, Nov 24 1999 Cognex Corporation Video safety curtain
6321172, Feb 12 1998 Sick AG Method for configuring sensors
6327806, Sep 25 1996 OPTICS RESEARCH HK LTD ; LEUPOLD & STEVENS, INC Optical sighting devices
6329800, Oct 17 2000 SIGMATEL, LLC Method and apparatus for reducing power consumption in driver circuits
6335789, Feb 25 1998 Honda Giken Kogyo Kabushiki Kaisha Optical radar system
6365429, Dec 30 1998 SAMSUNG ELECTRONICS CO , LTD Method for nitride based laser diode with growth substrate removed using an intermediate substrate
6396577, Mar 19 2001 Lidar-based air defense system
6420698, Apr 24 1997 Leica Geosystems AG Integrated system for quickly and accurately imaging and modeling three-dimensional objects
6441363, Feb 24 1999 Siemens VDO Automotive Corporation Vehicle occupant sensing system
6441889, Nov 29 2000 KUSTOM SIGNALS, INC LIDAR with increased emitted laser power
6442476, Apr 15 1998 COMMONWEALTH SCIENTIFIC AND INDUSTRIAL RESEARCH ORGANISATION Method of tracking and sensing position of objects
6473079, Apr 24 1996 Leica Geosystems AG Integrated system for quickly and accurately imaging and modeling three-dimensional objects
6504712, Jun 01 1999 Showa Denko K.K. Heat sinks for CPUs for use in personal computers
6509958, May 31 2000 Sick AG Method for distance measurement and a distance measuring device
6593582, May 11 2001 SCIENCE & ENGINEERING SERVICES, INC Portable digital lidar system
6621764, Apr 30 1997 Weapon location by acoustic-optic sensor fusion
6636300, Mar 18 1999 Siemens Aktiengesellschaft Spatially resolving range-finding system
6646725, Jul 11 2001 Iowa Research Foundation Multiple beam lidar system for wind measurement
6650402, Feb 10 2000 Oceanit Laboratories, Inc. Omni-directional cloud height indicator
6664529, Jul 19 2000 Utah State University 3D multispectral lidar
6665063, Sep 04 2001 Rosemount Aerospace Inc. Distributed laser obstacle awareness system
6670905, Jun 14 1999 AMERICAN CAPITAL FINANCIAL SERVICES, INC , AS SUCCESSOR ADMINISTRATIVE AGENT Radar warning receiver with position and velocity sensitive functions
6682478, Feb 08 2001 Olympus Corporation Endoscope apparatus with an insertion part having a small outer diameter which includes an object optical system
6687373, Aug 24 1999 Nortel Networks Limited Heuristics for optimum beta factor and filter order determination in echo canceler systems
6710324, Oct 29 2001 Sick AG Optoelectronic distance measuring device
6742707, Jun 07 2000 METROLOGIC INSTRUMENTS, INC , A CORPORATION OF NEW JERSEY Method of speckle-noise pattern reduction and apparatus therefor based on reducing the spatial-coherence of the planar laser illumination beam before the beam illuminates the target object by applying spatial phase shifting techniques during the transmission of the PLIB theretowards
6747747, Mar 05 2001 Sick AG Apparatus for determining a distance profile
6759649, Sep 03 2001 Sick AG Optoelectronic detection device
6789527, Sep 04 2000 Robert Bosch GmbH Method for adaptively controlling knocking of a gasoline direct fuel injection internal combustion engine, and a corresponding device
6812450, Mar 05 2002 Sick AG Method and an apparatus for monitoring a protected zone
6876790, May 17 2002 Science & Engineering Services, Inc. Method of coupling a laser signal to an optical carrier
6879419, Dec 05 2002 Northrop Grumman Systems Corporation Laser scanner with peripheral scanning capability
6969558, Oct 13 1992 General Electric Company Low sulfur article having a platinum-aluminide protective layer, and its preparation
7030968, Nov 24 2000 Trimble AB Device for the three-dimensional recording of a scene using laser emission
7041962, Jul 05 2002 Sick AG Laser scanning apparatus
7089114, Jul 03 2003 Vehicle collision avoidance system and method
7106424, Mar 11 2003 Rosemount Aerospace Inc. Compact laser altimeter system
7129971, Feb 16 2000 IMMERSIVE LICENSING, INC Rotating scan self-cleaning camera
7130672, Sep 25 2001 CRITISENSE LTD Apparatus and method for monitoring tissue vitality parameters
7131586, Jun 07 2000 Metrologic Instruments, Inc. Method of and apparatus for reducing speckle-pattern noise in a planar laser illumination and imaging (PLIIM) based system
7190465, Aug 30 2001 Z + F Zoller & Froehlich GmbH Laser measurement system
7240314, Jun 04 2004 Synopsys, Inc Redundantly tied metal fill for IR-drop and layout density optimization
7248342, Feb 14 2003 United States of America as represented by the Administrator of the National Aeronautics and Space Administration Three-dimension imaging lidar
7281891, Feb 28 2003 Qinetiq Limited Wind turbine control having a lidar wind speed measurement apparatus
7295298, Jun 05 2001 Sick AG Detection method and a detection apparatus
7313424, Mar 20 2002 CRITISENSE LTD Diagnosis of body metabolic emergency state
7315377, Feb 10 2003 University of Virginia Patent Foundation System and method for remote sensing and/or analyzing spectral properties of targets and/or chemical species for detection and identification thereof
7319777, Apr 04 2001 Instro Precision Limited Image analysis apparatus
7345271, Sep 25 2002 Sick AG Optoelectric sensing device with common deflection device
7358819, Jan 17 2006 ROCKWELL AUTOMATION TECHNOLOGIES, INC Reduced-size sensor circuit
7373473, Mar 10 2004 Leica Geosystems AG System and method for efficient storage and manipulation of extremely large amounts of scan data
7408462, Sep 16 2004 Sick AG Control of monitored zone
7477360, Feb 11 2005 UATC, LLC Method and apparatus for displaying a 2D image data set combined with a 3D rangefinder data set
7480031, Jun 10 2006 Sick AG Scanner
7544945, Feb 06 2006 AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE LIMITED Vertical cavity surface emitting laser (VCSEL) array laser scanner
7570793, Jun 15 2001 Ibeo Automotive Systems GmbH Correction method for data of a plurality of optoelectronic sensors
7583364, Mar 19 2004 University Corporation for Atmospheric Research High pulse-energy, eye-safe lidar system
7589826, Dec 20 2006 Sick AG Laser scanner
7619477, Jan 18 2006 Infineon Technologies Americas Corp Current sense amplifier for voltage converter
7623222, Dec 18 2004 Leica Geosystems AG Single-channel heterodyne distance-measuring method
7640068, Jul 03 2006 Trimble AB Surveying instrument and method of controlling a surveying instrument
7642946, Apr 07 2008 AVAGO TECHNOLOGIES INTERNATIONAL SALES PTE LIMITED Successive approximation analog to digital converter
7684590, Apr 19 2004 Sick AG Method of recognizing and/or tracking objects
7697581, Mar 16 2004 Leica Geosystems AG Laser operation for survey instruments
7741618, Nov 19 2004 SCIENCE & ENGINEERING SERVICES, INC Enhanced portable digital lidar system
7746271, Aug 28 2006 Sick AG Method for determining the global position
7868665, Mar 05 2002 NOVA R & D, INC Integrated circuit and sensor for imaging
7944548, Mar 07 2006 Leica Geosystems AG Increasing measurement rate in time of flight measurement apparatuses
7969558, Jul 13 2006 VELODYNE LIDAR USA, INC High definition lidar system
8042056, Mar 16 2004 Leica Geosystems AG Browsers for large geometric data visualization
8072582, Aug 19 2008 Rosemount Aerospace Inc. Lidar system using a pseudo-random pulse sequence
8077047, Apr 16 2009 UT-Battelle, LLC Tampering detection system using quantum-mechanical systems
8139685, May 10 2005 Qualcomm Incorporated Systems, methods, and apparatus for frequency control
8203702, Jun 13 2005 Arete Associates Optical system
8274037, Jan 27 2010 INTERSIL AMERICAS LLC Automatic calibration technique for time of flight (TOF) transceivers
8310653, Dec 25 2008 Kabushiki Kaisha Topcon Laser scanner, laser scanner measuring system, calibration method for laser scanner measuring system and target for calibration
8451432, Jun 09 2005 Analog Modules, Inc Laser spot tracking with off-axis angle detection
8605262, Jun 23 2010 The United States of America as represented by the Administrator of the National Aeronautics and Space Administration Time shifted PN codes for CW LiDAR, radar, and sonar
8675181, Jun 02 2009 VELODYNE LIDAR USA, INC Color LiDAR scanner
8736818, Aug 16 2010 BAE SYSTEMS SPACE & MISSION SYSTEMS INC Electronically steered flash LIDAR
8767190, Jul 13 2006 VELODYNE LIDAR USA, INC High definition LiDAR system
8875409, Jan 20 2010 FARO TECHNOLOGIES, INC Coordinate measurement machines with removable accessories
8976340, Apr 15 2011 Continental Autonomous Mobility US, LLC Ladar sensor for landing, docking and approach
8995478, Apr 08 2014 TEKHNOSCAN - LAB LLC Passively mode-locked pulsed fiber laser
9059562, Jun 23 2011 DAYLIGHT SOLUTIONS, INC Control system for directing power to a laser assembly
9063549, Mar 06 2013 Waymo LLC Light detection and ranging device with oscillating mirror driven by magnetically interactive coil
9069061, Jul 19 2011 BAE SYSTEMS SPACE & MISSION SYSTEMS INC LIDAR with analog memory
9069080, May 24 2013 Continental Autonomous Mobility US, LLC Automotive auxiliary ladar sensor
9086273, Mar 08 2013 GOOGLE LLC Microrod compression of laser beam in combination with transmit lens
9093969, Aug 15 2012 Skyworks Solutions, Inc Systems, circuits and methods related to controllers for radio-frequency power amplifiers
9110154, Feb 19 2014 Raytheon Company Portable programmable ladar test target
9151940, Dec 05 2012 KLA-Tencor Corporation Semiconductor inspection and metrology system using laser pulse multiplier
9191260, Mar 08 2013 LIGHTWORKS II, LLC Method and apparatus to determine a match between signals
9194701, Dec 23 2011 Leica Geosystems AG Distance-measuring device alignment
9239959, Apr 08 2013 Lockheed Martin Corporation Multi-resolution, wide field-of-view, unmanned ground vehicle navigation sensor
9246041, Apr 26 2012 ID QUANTIQUE SA Apparatus and method for allowing avalanche photodiode based single-photon detectors to be driven by the same electrical circuit in gated and in free-running modes
9250327, Mar 05 2013 SubCarrier Systems Corporation Method and apparatus for reducing satellite position message payload by adaptive data compression techniques
9285477, Jan 25 2013 Apple Inc. 3D depth point cloud from timing flight of 2D scanned light beam pulses
9286538, May 01 2014 HRL Laboratories, LLC Adaptive 3D to 2D projection for different height slices and extraction of robust morphological features for 3D object recognition
9310197, May 26 2011 Hilti Aktiengesellschaft Measuring device for measuring distance
9383753, Sep 26 2012 GOOGLE LLC Wide-view LIDAR with areas of special attention
9453914, Sep 09 2011 Continental Autonomous Mobility US, LLC Terrain mapping LADAR system
9529079, Mar 26 2015 GOOGLE LLC Multiplexed multichannel photodetector
9772607, Aug 23 2013 SICPA HOLDING SA Method and system for authenticating a device
9964632, Mar 26 2015 Waymo LLC Multiplexed multichannel photodetector
9983297, Mar 21 2016 VELODYNE LIDAR USA, INC LIDAR based 3-D imaging with varying illumination field density
9989629, Mar 30 2017 LUMINAR TECHNOLOGIES, INC Cross-talk mitigation using wavelength switching
20010011289,
20010017718,
20020003617,
20020060784,
20020117545,
20030041079,
20030043363,
20030043364,
20030057533,
20030066977,
20030076485,
20030090646,
20030163030,
20040021852,
20040066500,
20040134879,
20040150810,
20040213463,
20040240706,
20040240710,
20040247157,
20050023353,
20050168720,
20050211893,
20050232466,
20050246065,
20050248749,
20050279914,
20060007350,
20060089765,
20060100783,
20060115113,
20060132635,
20060176697,
20060186326,
20060197867,
20060231771,
20060290920,
20070035624,
20070071056,
20070121095,
20070181810,
20070201027,
20070219720,
20070241955,
20070272841,
20080002176,
20080013896,
20080074640,
20080079371,
20080154495,
20080170826,
20080186501,
20080302971,
20090010644,
20090026503,
20090085901,
20090122295,
20090142053,
20090168045,
20090218475,
20090245788,
20090323737,
20100006760,
20100020306,
20100045965,
20100046953,
20100067070,
20100073780,
20100074532,
20100134596,
20100188722,
20100198487,
20100204964,
20100239139,
20100265077,
20100271615,
20100302528,
20110028859,
20110040482,
20110176183,
20110211188,
20110216304,
20110305250,
20120038903,
20120195597,
20120287417,
20130024176,
20130038915,
20130050144,
20130050486,
20130070239,
20130093583,
20130094960,
20130151198,
20130168673,
20130206967,
20130241761,
20130242283,
20130258312,
20130286404,
20130300479,
20130314711,
20130336375,
20130342366,
20140063483,
20140071234,
20140078519,
20140104592,
20140176657,
20140240317,
20140240721,
20140253369,
20140259715,
20140267848,
20140274093,
20140347650,
20150015895,
20150035437,
20150055117,
20150101234,
20150116695,
20150131080,
20150144806,
20150185325,
20150202939,
20150219764,
20150219765,
20150226853,
20150293224,
20150293228,
20150303216,
20160003946,
20160009410,
20160014309,
20160021713,
20160049058,
20160098620,
20160117431,
20160154105,
20160161600,
20160191173,
20160209499,
20160245919,
20160259038,
20160279808,
20160300484,
20160306032,
20160313445,
20160363659,
20160365846,
20170146639,
20170146640,
20170153319,
20170214861,
20170219695,
20170220876,
20170242102,
20170269198,
20170269209,
20170269215,
20170299721,
20170350983,
20180019155,
20180058197,
20180059219,
20180074382,
20180100924,
20180106902,
20180168539,
20180267151,
20180275249,
20180284227,
20180284274,
20180321360,
20180364098,
20190001442,
20190011563,
20190178991,
20190339365,
20190361092,
20190369257,
20190369258,
20200025896,
20200064452,
20200142070,
20200144971,
20200166613,
20200191915,
CA2089105,
CH641583,
CN103278808,
CN106443699,
CN106597471,
CN1106534,
CN1576123,
CN206773192,
CN208902906,
CN2681085,
CN2773714,
DE10025511,
DE10110420,
DE10114362,
DE10127417,
DE10128954,
DE10141055,
DE10143060,
DE10146692,
DE10148070,
DE10151983,
DE10162668,
DE102004010197,
DE102004014041,
DE102005003827,
DE102005019233,
DE102005050824,
DE102007013023,
DE10217295,
DE10222797,
DE10229408,
DE10244638,
DE10244640,
DE10244643,
DE10258794,
DE10303015,
DE10331529,
DE10341548,
DE19512644,
DE19512681,
DE19717399,
DE19727792,
DE19741730,
DE19741731,
DE19752145,
DE19757840,
DE19757847,
DE19757848,
DE19757849,
DE19815149,
DE19828000,
DE19902903,
DE19911375,
DE19919925,
DE19927501,
DE19936440,
DE19953006,
DE19953007,
DE19953009,
DE19953010,
DE202015009250,
DE3134815,
DE3216312,
DE3216313,
DE3701340,
DE3741259,
DE3808972,
DE3821892,
DE4040894,
DE4115747,
DE4124192,
DE4127168,
DE4137550,
DE4215272,
DE4243631,
DE4340756,
DE4345446,
DE4345448,
DE4411448,
DE4412044,
DE930909,
EP185816,
EP361188,
EP396865,
EP412395,
EP412398,
EP412399,
EP412400,
EP468175,
EP486430,
EP653720,
EP656868,
EP897120,
EP913707,
EP937996,
EP967492,
EP1046938,
EP1055937,
EP1148345,
EP1160718,
EP1174733,
EP1267177,
EP1267178,
EP1286178,
EP1286181,
EP1288677,
EP1291673,
EP1291674,
EP1298012,
EP1298453,
EP1298454,
EP1300715,
EP1302784,
EP1304583,
EP1306690,
EP1308747,
EP1355128,
EP1403657,
EP1408318,
EP1418444,
EP1460454,
EP1475764,
EP1515157,
EP1531342,
EP1531343,
EP1548351,
EP1557691,
EP1557692,
EP1557693,
EP1557694,
EP1700763,
EP1914564,
EP1927867,
EP1939652,
EP1947377,
EP1983354,
EP2003471,
EP2177931,
EP2503360,
GB2041687,
JP11264871,
JP2001216592,
JP2001256576,
JP2002031528,
JP2003336447,
JP2004348575,
JP2005070840,
JP2005297863,
JP2006177843,
JP36407,
JP5240940,
JP6288725,
RE45854, Jul 03 2006 Faro Technologies, Inc. Method and an apparatus for capturing three-dimensional data of an area of space
RE46672, Jul 13 2006 VELODYNE LIDAR USA, INC High definition LiDAR system
RE47942, Jul 13 2006 VELODYNE LIDAR USA, INC High definition lidar system
WO131608,
WO3019234,
WO3040755,
WO1999003080,
WO2000025089,
WO2004019293,
WO2004036245,
WO2008008970,
WO2009120706,
WO2015079300,
WO2015104572,
WO2016162568,
WO2017033419,
WO2017089063,
WO2017132703,
WO2017164989,
WO2017165316,
WO2017193269,
WO2018125823,
WO2018196001,
Executed on | Assignor | Assignee | Conveyance | Reel/Frame | Doc
May 17 2011 | HALL, DAVID S | VELODYNE ACOUSTICS, INC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 055078/0217 | pdf
Dec 31 2015 | VELODYNE ACOUSTICS, INC | VELODYNE ACOUSTICS, LLC | CONVERSION | 055170/0774 | pdf
Dec 31 2015 | VELODYNE ACOUSTICS, LLC | VELODYNE LIDAR, INC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 055078/0443 | pdf
Dec 31 2015 | VELODYNE ACOUSTICS, INC | VELODYNE ACOUSTICS, INC | MERGER (SEE DOCUMENT FOR DETAILS) | 055078/0309 | pdf
Sep 11 2017 | Velodyne LIDAR USA, Inc. (assignment on the face of the patent)
Sep 29 2020 | VELODYNE LIDAR USA, INC | VELODYNE LIDAR USA, INC | MERGER AND CHANGE OF NAME (SEE DOCUMENT FOR DETAILS) | 054438/0260 | pdf
Sep 29 2020 | VELODYNE LIDAR, INC | VELODYNE LIDAR USA, INC | MERGER AND CHANGE OF NAME (SEE DOCUMENT FOR DETAILS) | 054438/0260 | pdf
Sep 29 2020 | VL MERGER SUB INC | VELODYNE LIDAR USA, INC | MERGER AND CHANGE OF NAME (SEE DOCUMENT FOR DETAILS) | 054438/0260 | pdf
May 09 2023 | VELODYNE LIDAR USA, INC | HERCULES CAPITAL, INC., AS AGENT | SECURITY INTEREST (SEE DOCUMENT FOR DETAILS) | 063593/0463 | pdf
Oct 25 2023 | HERCULES CAPITAL, INC | VELODYNE LIDAR USA, INC | RELEASE OF INTELLECTUAL PROPERTY SECURITY AGREEMENT RECORDED AT REEL/FRAME NO. 063593/0463 | 065350/0801 | pdf
Date Maintenance Fee Events
Sep 11 2017 | BIG: Entity status set to Undiscounted (note the period is included in the code).
Sep 14 2017 | SMAL: Entity status set to Small.
Aug 16 2019 | BIG: Entity status set to Undiscounted (note the period is included in the code).
Feb 22 2022 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity.
Feb 22 2022 | M1555: 7.5 yr surcharge - late pmt w/in 6 mo, Large Entity.


Date Maintenance Schedule
Mar 30 2024 | 4 years fee payment window open
Sep 30 2024 | 6 months grace period start (w surcharge)
Mar 30 2025 | patent expiry (for year 4)
Mar 30 2027 | 2 years to revive unintentionally abandoned end. (for year 4)
Mar 30 2028 | 8 years fee payment window open
Sep 30 2028 | 6 months grace period start (w surcharge)
Mar 30 2029 | patent expiry (for year 8)
Mar 30 2031 | 2 years to revive unintentionally abandoned end. (for year 8)
Mar 30 2032 | 12 years fee payment window open
Sep 30 2032 | 6 months grace period start (w surcharge)
Mar 30 2033 | patent expiry (for year 12)
Mar 30 2035 | 2 years to revive unintentionally abandoned end. (for year 12)