A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, comprising: providing a 3d optical emitter; providing a 3d optical receiver with a wide and deep field of view; driving the 3d optical emitter into emitting short light pulses; receiving a reflection/backscatter of the emitted light, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of the 3d optical receiver; using the individual digital full-waveform LIDAR trace and the emitted light waveform, detecting a presence of a plurality of vehicles, a position of at least part of each vehicle and a time at which the position is detected; assigning a unique identifier to each vehicle; repeating the steps of driving, receiving, acquiring and detecting, at a predetermined frequency; tracking and recording an updated position of each vehicle and an updated time at which the updated position is detected.

Patent: RE48914
Priority: Mar 01 2013
Filed: Jan 11 2018
Issued: Feb 01 2022
Expiry: Mar 01 2033
Assignee: LeddarTech Inc.
Entity: Large
Status: currently ok
26. A vehicle detection system for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the system comprising:
a 3d optical emitter provided at an installation height and oriented to allow illumination of a 3d detection zone in the environment;
a 3d optical receiver provided and oriented to have a wide and deep field of view within the 3d detection zone, the 3d optical receiver having a plurality of detection channels in said field of view;
a controller for driving the 3d optical emitter into emitting short light pulses toward the detection zone, the light pulses having an emitted light waveform;
the 3d optical receiver for receiving a reflection/backscatter of the emitted light on the vehicles in the 3d detection zone, thereby acquiring an individual digital full-waveform light detection and ranging (LIDAR) trace for each channel of the 3d optical receiver;
a processor configured for detecting a presence of a plurality of vehicles in the 3d detection zone using the individual digital full-waveform LIDAR trace and the emitted light waveform, capturing a series of vehicle measurements from the LIDAR trace, detecting a position of at least part of each vehicle in the 3d detection zone, recording a time at which the position is detected, assigning a unique identifier to each vehicle of the plurality of vehicles detected, and tracking and recording an updated position of each vehicle of the plurality of vehicles detected and an updated time at which the updated position is detected, with the unique identifier;
a 2d optical receiver, wherein the 2d optical receiver is an image sensor adapted to provide images of a 2d detection zone; and
a driver for driving the image sensor of the 2d optical receiver to capture the images;
the processor being further configured for using image registration to correlate corresponding locations between said images and said detection channels, estimating a length of a vehicle by fitting a first line to a first subset of said series of vehicle measurements, estimating a width of a vehicle by fitting a second line to a second subset of the vehicle measurements, extracting vehicle identification data from the images at a location corresponding to the location for a detected vehicle, and assigning the vehicle identification data to the unique identifier.
1. A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the method comprising:
providing a 3d optical emitter at an installation height oriented to allow illumination of a 3d detection zone in said environment;
providing a 3d optical receiver oriented to have a wide and deep field of view within said 3d detection zone, said 3d optical receiver having a plurality of detection channels in said field of view;
driving the 3d optical emitter into emitting short light pulses toward the detection zone, said light pulses having an emitted light waveform;
receiving a reflection/backscatter of the emitted light on the vehicles in the 3d detection zone at said 3d optical receiver, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of said 3d optical receiver;
using said individual digital full-waveform LIDAR trace and said emitted light waveform, detecting a presence of a plurality of vehicles in said 3d detection zone, a position of at least part of each said vehicle in said 3d detection zone and a time at which said position is detected;
assigning a unique identifier to each vehicle of said plurality of vehicles detected;
repeating said steps of driving, receiving, acquiring and detecting, at a predetermined frequency;
at each instance of said repeating step, tracking and recording an updated position of each vehicle of said plurality of vehicles detected and an updated time at which said updated position is detected, with said unique identifier;
wherein said detecting said presence includes:
extracting observations in the individual digital full-waveform LIDAR trace;
using the location for the observations to remove observations coming from a surrounding environment;
extracting lines using a line estimate and a covariance matrix in polar coordinates;
removing observations located on lines parallel to the x axis.
2. The method as claimed in claim 1, wherein said traffic control environment is at least one of a traffic management environment and a traffic enforcement environment.
3. The method as claimed in claim 1, wherein said detecting said presence includes:
extracting observations in the individual digital full-waveform LIDAR trace and intensity data for the observations;
finding at least one blob in the observations;
computing an observation weight depending on the intensity of the observations in the blob;
computing a blob gravity center based on the weight and a position of the observations in the blob.
4. The method as claimed in claim 1, further comprising setting at least one trigger line location and recording trigger line trespassing data with the unique identifier.
5. The method as claimed in claim 4, further comprising setting said trigger line location relative to a visible landmark in said environment.
6. The method as claimed in claim 1, wherein said detecting said time at which said position is detected includes assigning a timestamp for said detecting said presence and wherein said timestamp is adapted to be synchronized with an external controller.
7. The method as claimed in claim 1, further comprising obtaining a classification for each detected vehicle using a plurality of detections in the 3d detection zone caused by the same vehicle.
8. The method as claimed in claim 1, wherein said detecting said presence further comprises detecting a presence of a pedestrian in said environment.
9. The method as claimed in claim 1, wherein said part of said vehicle is one of a front, a side and a rear of the vehicle.
10. The method as claimed in claim 1, wherein emitting short light pulses includes emitting short light pulses of a duration of less than 50 ns.
11. The method as claimed in claim 1, wherein said 3d optical emitter is at least one of an infrared LED source, a visible-light LED source and a laser.
12. The method as claimed in claim 1, wherein said providing said 3d optical receiver to have a wide and deep field of view includes providing said 3d optical receiver to have a horizontal field of view angle of at least 20° and a vertical field of view angle of at least 4°.
13. The method as claimed in claim 1, further comprising determining and recording a speed for each said vehicle using said position and said updated position of one of said instances of said repeating step and an elapsed time between said time of said position and said updated time of said updated position, with said unique identifier.
14. The method as claimed in claim 13, further comprising using a Kalman filter to determine an accuracy for said speed to validate said speed; comparing said accuracy to a predetermined accuracy threshold; and, if said accuracy is lower than said predetermined accuracy threshold, rejecting said speed.
15. The method as claimed in claim 14, further comprising retrieving a speed limit and identifying a speed limit infraction by comparing said speed recorded for each said vehicle to said speed limit.
16. The method as claimed in claim 1, further comprising:
providing a 2d optical receiver, wherein said 2d optical receiver is an image sensor adapted to provide images of a 2d detection zone;
driving the 2d optical receiver to capture a 2d image;
using image registration to correlate corresponding locations between said 2d image and said detection channels;
extracting vehicle identification data from said 2d image at a location corresponding to said location for said detected vehicle; and
assigning said vehicle identification data to said unique identifier.
17. The method as claimed in claim 16, wherein the vehicle identification data is at least one of a picture of the vehicle and a license plate alphanumerical code present on the vehicle.
18. The method as claimed in claim 17, wherein the vehicle identification data includes said 2d image showing a traffic violation.
19. The method as claimed in claim 17, further comprising extracting at least one of a size of characters on the license plate and a size of the license plate and comparing one of said sizes among different instances of the repeating step to determine an approximate speed value.
20. The method as claimed in claim 16, further comprising providing a 2d illumination source oriented to allow illumination of a 2d detection zone in said 3d detection zone, driving the 2d illumination source to emit pulses to illuminate said 2d detection zone and synchronizing said driving the 2d optical receiver to capture images with said driving the 2d illumination source to emit pulses to allow capture of said images during said illumination.
21. The method as claimed in claim 20, wherein driving the 2d illumination source includes driving the 2d illumination source to emit pulses of a duration between 10 μs and 10 ms.
22. The method as claimed in claim 20, wherein the 2d illumination source is at least one of a visible-light LED source, an infrared LED light source and a laser.
23. The method as claimed in claim 20, wherein the 3d optical emitter and the 2d illumination source are provided by a common infrared LED light source.
24. The method as claimed in claim 20, wherein the vehicle identification data is at least two areas of high retroreflectivity apparent on the images, said detecting a presence includes extracting observations in the individual digital signals and intensity data for the observations, the method further comprising correlating locations for the areas of high retroreflectivity and high-intensity data locations in the observations, wherein each said area of high retroreflectivity is created from one of a retroreflective license plate, a retro-reflector affixed on a vehicle and a retro-reflective lighting module provided on a vehicle.
25. The method as claimed in claim 16, further comprising combining multiple ones of said captured images into a combined image with the vehicle and the vehicle identification data apparent.
27. The system as claimed in claim 26, wherein said processor is further configured for determining and recording a speed for each vehicle using the position and an updated position of one of the instances of the repeating step, and an elapsed time between the time of the position and an updated time of the updated position, with the unique identifier.
28. The system as claimed in claim 26, further comprising a 2d illumination source provided and oriented to allow illumination of a 2d detection zone in the 3d detection zone;
a source driver for driving the 2d illumination source to emit pulses;
a synchronization module for synchronizing said source driver and said driver to allow capture of said images while said 2d detection zone is illuminated.
29. A method for tracking and characterizing a plurality of vehicles simultaneously in a traffic control environment, the method comprising:
providing a 3d optical emitter at an installation height oriented to allow illumination of a 3d detection zone in said environment;
providing a 3d optical receiver oriented to have a wide and deep field of view within said 3d detection zone, said 3d optical receiver having a plurality of detection channels in said field of view;
driving the 3d optical emitter into emitting short light pulses toward the detection zone, said light pulses having an emitted light waveform;
receiving a reflection/backscatter of the emitted light on the vehicles in the 3d detection zone at said 3d optical receiver, thereby acquiring an individual digital full-waveform LIDAR trace for each detection channel of said 3d optical receiver;
using said individual digital full-waveform LIDAR trace and said emitted light waveform, detecting a presence of a plurality of vehicles in said 3d detection zone, a position of at least part of each said vehicle in said 3d detection zone and a time at which said position is detected;
assigning a unique identifier to each vehicle of said plurality of vehicles detected;
repeating said steps of driving, receiving, acquiring and detecting, at a predetermined frequency;
at each instance of said repeating step, tracking and recording an updated position of each vehicle of said plurality of vehicles detected and an updated time at which said updated position is detected, with said unique identifier,
wherein said detecting said presence includes:
extracting observations in the individual digital full-waveform LIDAR trace and intensity data for the observations;
finding at least one blob in the observations;
computing an observation weight depending on the intensity of the observations in the blob;
computing a blob gravity center based on the weight and a position of the observations in the blob.
30. The vehicle detection system of claim 26, wherein the processor is further configured to estimate a height of the vehicle.
31. The vehicle detection system of claim 26, wherein the processor is configured to estimate a volume of the vehicle based at least in part on the vehicle measurements.
32. The vehicle detection system of claim 26, wherein the processor is configured to identify a corner point of the vehicle less than a threshold distance from points on both of the first and second lines.
33. The vehicle detection system of claim 32, wherein the processor is configured to define a three-dimensional bounding box corresponding to the vehicle based on detection of corners.
34. The vehicle detection system of claim 33, wherein the three-dimensional bounding box represents an estimate of bounding dimensions of the vehicle.
35. The vehicle detection system of claim 34, wherein the processor is further configured to refine the estimate of the bounding dimensions as the light detection and ranging (LIDAR) trace is produced by reflection of the illumination signals from an increasing number of sides of the vehicle.
36. The vehicle detection system of claim 26, wherein the light detection and ranging (LIDAR) trace includes reflection of the illumination signals from a complete side of the vehicle, and wherein the processor is configured to determine dimensions of a three-dimensional bounding box corresponding to the vehicle based at least on full-waveform signal processing of the signal waveforms from the vehicle measurements for a complete side of the vehicle.
37. The vehicle detection system of claim 26, wherein the processor is configured to account for changes in the distance between the vehicle and the 3d optical emitter due to relative movement between the 3d optical emitter and the vehicle.
38. The vehicle detection system of claim 26, wherein the processor is configured to assign a classification to the vehicle based on a dimension of the vehicle.
39. The vehicle detection system of claim 38, wherein the classification is according to a classification scheme distinguishing two-wheeled and four-wheeled vehicles.
40. The vehicle detection system of claim 26, wherein the processor is configured to trigger an event based at least in part on a dimension of the vehicle.

A line is represented in the polar (Hesse normal) form

$$x\cos\alpha + y\sin\alpha = r$$

where −π < α ≤ π is the angle between the x axis and the normal of the line, r ≥ 0 is the perpendicular distance of the line to the origin, and (x, y) are the Cartesian coordinates of a point on the line. The covariance matrix of the line parameters is:

$$\operatorname{cov}(r,\alpha)=\begin{bmatrix}\sigma_r^{2} & \sigma_{r\alpha}\\ \sigma_{r\alpha} & \sigma_{\alpha}^{2}\end{bmatrix}$$
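As a sketch of how such line parameters can be obtained in practice, the following Python fragment fits the polar form x·cos α + y·sin α = r to a set of 2D observations by orthogonal least squares. This is an illustration of the standard fit, not code from the patent; the covariance terms above would follow from propagating the per-point measurement noise through this fit.

```python
import numpy as np

def fit_line_polar(x, y):
    """Fit x*cos(alpha) + y*sin(alpha) = r to 2D points (orthogonal regression)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    dx, dy = x - xm, y - ym
    # Angle of the line normal that minimizes perpendicular distances.
    alpha = 0.5 * np.arctan2(-2.0 * np.sum(dx * dy), np.sum(dy**2 - dx**2))
    r = xm * np.cos(alpha) + ym * np.sin(alpha)
    if r < 0:                       # enforce r >= 0 with -pi < alpha <= pi
        r, alpha = -r, alpha + np.pi
    if alpha > np.pi:
        alpha -= 2.0 * np.pi
    return r, alpha
```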

FIG. 19 shows a state diagram for the 3D real-time detection multi-object tracker. The core of the tracker 91A is based on a Kalman filter, which performs robustly in all weather and lighting conditions. The observation model 90 is illustrated in FIG. 21, which presents an example method to compute the vehicle position by weighting each 3D observation according to its amplitude. This method improves the accuracy of the estimated position with respect to using only the x and y Cartesian positions.

Expression 301 computes the blob position as follows:

$$P_{\mathrm{blob}}=\sum_{n=1}^{N}\pi_n\,P_n$$

where π_n is the intensity weight for observation n, n ∈ {1, …, N}, and N is the number of observations grouped together. Step 301 is followed by computing the observation weight depending on the intensity at step 302.

The function 300 normalizes the weight π_n according to the amplitude A_n of the observation P_n:

$$\pi_n=\frac{A_n}{\sum_{n} A_n}$$
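A minimal sketch of steps 300 to 302 (weight normalization followed by the weighted blob position), assuming observations are given as an N×2 array of positions with per-observation amplitudes; the helper name is illustrative, not from the patent:

```python
import numpy as np

def blob_position(P, A):
    """P: (N, 2) observation positions; A: (N,) echo amplitudes."""
    P, A = np.asarray(P, float), np.asarray(A, float)
    w = A / A.sum()        # pi_n = A_n / sum(A_n)     (step 300)
    return w @ P           # P_blob = sum(pi_n * P_n)  (expression 301)
```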

The state evolution model 92 is the classical constant-speed (constant-velocity) model. The kinematic model can be written in matrix form as:

$$p_{k+1}=F\,p_k+G\,V_k,\qquad V_k\sim N(0,Q_k)$$

where p_k = (x_obs, ẋ_obs, y_obs, ẏ_obs) is the target state vector, F the transition matrix which models the evolution of p_k, Q_k the covariance matrix of V_k, and G the noise matrix, which models the acceleration:

$$F=\begin{bmatrix}1&\Delta T&0&0\\0&1&0&0\\0&0&1&\Delta T\\0&0&0&1\end{bmatrix}\qquad
G=\begin{bmatrix}\tfrac{\Delta T^{2}}{2}&0\\ \Delta T&0\\ 0&\tfrac{\Delta T^{2}}{2}\\ 0&\Delta T\end{bmatrix}\qquad
Q_k=\begin{bmatrix}\sigma_x^{2}&0\\ 0&\sigma_y^{2}\end{bmatrix}$$

The observation equation can be written as:

$$Z_k=H\,p_k+W_k,\qquad W_k\sim N(0,R_k)$$

where Z_k = (x_obs_k, y_obs_k)^t is the measurement vector, H the measurement sensitivity matrix, and R_k the covariance matrix of W_k:

$$H=\begin{bmatrix}1&0&0&0\\ 0&0&1&0\end{bmatrix}\qquad
R_k=\begin{bmatrix}\sigma_{\mathrm{obs}_x}^{2}&0\\ 0&\sigma_{\mathrm{obs}_y}^{2}\end{bmatrix}$$
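The following sketch assembles the matrices above into one predict/update cycle of the constant-speed Kalman filter. It assumes the process noise enters the state through G as written (the 2×2 Q_k mapped into state space); the function name and interface are illustrative only.

```python
import numpy as np

def cv_kalman_step(p, P, z, dT, sx2, sy2, rx2, ry2):
    """One predict/update cycle for the constant-speed model above.
    p = (x, vx, y, vy); P: 4x4 state covariance; z = (x_obs, y_obs)."""
    F = np.array([[1, dT, 0, 0],
                  [0, 1,  0, 0],
                  [0, 0,  1, dT],
                  [0, 0,  0, 1]], float)
    G = np.array([[dT**2 / 2, 0],
                  [dT,        0],
                  [0, dT**2 / 2],
                  [0,        dT]], float)
    Q = G @ np.diag([sx2, sy2]) @ G.T        # 2x2 Q_k mapped by the noise matrix G
    H = np.array([[1, 0, 0, 0],
                  [0, 0, 1, 0]], float)
    R = np.diag([rx2, ry2])
    # Predict
    p, P = F @ p, F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    nu = z - H @ p                           # innovation
    return p + K @ nu, (np.eye(4) - K @ H) @ P
```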

The state space model 93A is based on a probabilistic framework where the evolution model is assumed to be linear and the observation noise is assumed to be Gaussian. In a 3D image, the system state encodes the information observed in the scene, e.g. the number of vehicles and their characteristics: x_k^N = (p_k^N, l_k^N) with N the number of detected vehicles, where p_k^N denotes the 2D position of object N at iteration k and l_k^N gives identification, age, lane and the object classification.

FIG. 20 shows a state diagram for the 3D real-time detection multi-object joint tracker. The core of tracker 91B is based on a Kalman filter which addresses the issue of interacting targets, which cause occlusions. When an occlusion is present, 3D data alone can be unreliable and is not sufficient to detect the object of interest at each frame. If the algorithm uses the traffic light state 85, occlusions can be modeled with a joint state space model 93B. The multi-object joint tracker includes a multi-object interaction distance which is implemented by including an additional interaction factor in the vehicle position. The state space model 93B encodes the observations detected in the scene, e.g. the number of vehicles, the traffic light state and the interaction between the vehicles located in the same lane, by concatenating their configurations into a single super-state vector such as X_k = (O_k, x_k^1, …, x_k^N), with O_k the size of the state space at iteration k and x_k^N = (p_k^N, l_k^N) the state vector associated with object N, where p_k^N denotes the 2D position of object N at iteration k and l_k^N gives identification, age, lane, class, traffic light state and the object interaction.

Before integrating measurements into the filter, a selection is made by the two-step procedure shown in FIGS. 22 and 23: first the validation gate at step 400, then data association at step 401A/B. The validation gate is the ellipsoid of size N_z (the dimension of the measurement vector) defined such that:

$$\theta^{t}\,S^{-1}\,\theta\le\gamma$$

where θ = Z_k − Ẑ_{k|k−1} is the innovation, S the covariance matrix of the predicted measurement, and γ is obtained from the chi-square tables for N_z degrees of freedom; this threshold represents the probability that the (true) measurement falls in the gate. Step 400 is followed by step 401A/B, which matches blobs to hypotheses: (i) consider all entries as new blobs; (ii) find the corresponding entries to each blob by considering gating intervals around the predicted position of each hypothesis; (iii) choose the nearest entry of each interval as the corresponding final observation of each blob. At step 402, the tracking algorithm uses a track management module to change the number of hypotheses: (i) if, given the existing assumptions, an observation occurs that cannot be explained, the track management module proposes a new hypothesis; (ii) if an assumption does not find any observation for 500 ms, the track management module proposes to suppress the assumption. In this case, of course, an evolution model helps to guide the state space exploration of the Kalman filter algorithm with a prediction of the state. Finally, step 403 uses a Kalman framework to estimate the final position of the vehicle.
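A small sketch of the validation gate and nearest-entry association of steps 400 and 401A/B; the gate probability and the interface are assumptions, and scipy's chi-square quantile stands in for the chi-square tables mentioned above.

```python
import numpy as np
from scipy.stats import chi2

def gate_and_associate(z_pred, S, entries, p_gate=0.99):
    """Validation gate (step 400) + nearest-entry association (step 401):
    keep candidates with theta^t S^-1 theta <= gamma, return the nearest."""
    gamma = chi2.ppf(p_gate, df=len(z_pred))   # gate size for Nz dof
    S_inv = np.linalg.inv(S)
    best, best_d = None, np.inf
    for z in entries:
        theta = np.asarray(z, float) - z_pred  # innovation
        d = float(theta @ S_inv @ theta)       # squared Mahalanobis distance
        if d <= gamma and d < best_d:
            best, best_d = z, d
    return best                                # None if the gate is empty
```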

In a 3D image, the system state encodes the information observed in the scene: the number of vehicles and their characteristics is X_k = (O_k, x_k^1, …, x_k^N) with O_k the size of the state space (number of detected vehicles) at iteration k and x_k^N = (p_k^N, l_k^N) the state vector associated with object N, where p_k^N denotes the 2D position of object N at iteration k and l_k^N gives identification, age, lane and the object classification. Steps 90 and 92 are unchanged.

FIG. 24 shows the steps performed during the execution of the classification algorithm. At step 500, the algorithm checks if a line is detected in the 3D image. If a line is detected, step 500 is followed by step 501, which computes the vehicle length. Vehicle length is defined as the overall length of the vehicle (including attached trailers) from the front to the rear. In order to calculate the length, two different positions are used: X0 and X1. X0 is given by the position of the first detected line and X1 is given by the trigger line 1 (for example). Once the speed has been estimated, the vehicle length l can be determined as:

l[m] = s[m/s]·(X1(t)[s] − X0(t)[s]) − (X1(x)[m] − X0(x)[m]) + Seg[m] + TH[m]

where s is the vehicle speed, Seg is the length of the detected line and TH is a calibration threshold determined from a large dataset.
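In code form, the length computation of step 501 is a direct transcription of the formula (a hypothetical helper; speed in m/s, all lengths in metres):

```python
def vehicle_length(s, t0, t1, x0, x1, seg, th):
    """l = s*(X1(t) - X0(t)) - (X1(x) - X0(x)) + Seg + TH.
    s: speed (m/s); t0, t1: detection times (s); x0, x1: positions (m);
    seg: detected line length (m); th: calibration threshold (m)."""
    return s * (t1 - t0) - (x1 - x0) + seg + th
```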

If no line is detected at step 500, step 500 is followed by step 502, which computes the vehicle height. The vehicle height is estimated as the vehicle enters the sensor field of view. As shown in FIG. 26, for a known configuration of the detection system, there is a direct geometric relationship between the height of a vehicle 601 and the detection distance 600. The accuracy 602 depends on the half-size of the vertical FOV angle 603. The height measurement is validated if the accuracy is lower than a threshold.

Finally, step 502 is followed by step 503, which computes the vehicle width. Over the vehicle blob, let (yl, x) be the leftmost pixel and (yr, x) the rightmost pixel for a given x. The width w of the object is then given by:

w = |yr − yl|

FIGS. 25A, 25B and 25C show a result of vehicle classification based on the classification algorithm. For example, in FIG. 25A, the classification result is a heavy vehicle; in FIG. 25B, it is a four-wheeled lightweight vehicle and in FIG. 25C, it is a two-wheeled lightweight vehicle. The information from the detection system is flexible and can be adapted to different schemes of classification. FIG. 25 illustrates graphically the basic elements of the concept of an object-box approach which is detailed below and in FIG. 27 and FIG. 28.

The object-box approach is mainly intended for vehicles because this approach uses the vehicle geometry in a LEDDAR image. The vehicles are represented by a 3D rectangular box of detected length, width and height. The 3D size of the rectangular box will vary depending on the detections in the FOV. FIGS. 27A, 27B, 27C and 27D show top view frames of a vehicle detected by the LEDDAR sensor. FIGS. 28A, 28B, 28C and 28D show corresponding side view frames of the vehicle of FIG. 27.

FIGS. 27A, 27B, 27C, 27D and FIGS. 28A, 28B, 28C, 28D show the changing 3D size of the rectangle 701 for four example positions of a vehicle 702 in the 3D sensor FOV 703. When a vehicle 702 enters the 3D sensor FOV 703, two detections are made on the side of the vehicle (see FIG. 27A) and one detection is made for the top of the vehicle (see FIG. 28A). The 3D rectangle is initialized with a length equal to 4 m, a width of 1.5 m and a height OHm given by:
OHm=Hs−dist*tan(θ)

where Hs is the sensor height 704, dist is the distance of the detected vehicle and θ is the sensor pitch.
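A sketch of the geometric relationship of FIG. 26 and the OHm formula above. The accuracy helper reflects the dependence on the vertical FOV half-angle 603, but its exact form is an assumption for illustration, not the patent's formula.

```python
import math

def vehicle_height(hs, dist, pitch):
    """OHm = Hs - dist * tan(theta): height inferred from the distance at
    which the vehicle enters the sensor field of view. Angles in radians."""
    return hs - dist * math.tan(pitch)

def height_accuracy(dist, pitch, half_vfov):
    """Assumed bound: spread of heights across the vertical FOV half-angle
    at the measured distance."""
    return dist * (math.tan(pitch + half_vfov) - math.tan(pitch))
```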

FIG. 27B and FIG. 28B represent detections when the vehicle is three-fourths of the way in the detection FOV. Eight side detections are apparent on FIG. 27B and one top detection is apparent on FIG. 28B. The dimensions of the 3D rectangle are calculated as follows:

The width is not yet adjusted because the vehicle back is not yet detected.
Ol(k)=max(L2−L1,Ol(k−1))
Oh(k)=max(OHm,Oh(k−1))

where the points of a segment are clockwise angle sorted so L2 is the point with the smallest angle and L1 is the segment-point with the largest angle. Ol(k) and Oh(k) are respectively the current length and height value at time k.

FIG. 27C and FIG. 28C represent detections when the back of the vehicle begins to enter in the detection FOV. Eight side detections and two rear detections are apparent on FIG. 27C while one detection is apparent on FIG. 28C. The dimensions of the 3D rectangle are calculated as follows:
Ol(k)=max(L2−L1,Ol(k−1))
Oh(k)=max(OHm,Oh(k−1))
Ow(k)=max(L4−L3,Ow(k−1))

As for the horizontal segment representing the side of the vehicle, the points of the vertical segment representing the rear and/or the top of the vehicle are clockwise angle sorted, so L4 is the point with the smallest angle and L3 is the segment-point with the largest angle. Ol(k), Oh(k) and Ow(k) are respectively the current length, height and width value at time k.

FIG. 27D and FIG. 28D represent detections when the back of the vehicle is fully in the detection FOV. Six side detections and four rear detections are apparent on FIG. 27D while one detection is apparent on FIG. 28D. The width dimension Olm is calculated as follows:
Olm(k)=α*(L4−L3)+(1−α)*Olm(k−1)

where Olm(k) is the current width at time k and α is the filtering rate.

The size of the vehicle can then be determined fully.
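The object-box updates of FIGS. 27 and 28 can be summarized in a small class: length and height only grow (max with the previous value), while the width is first grown from rear detections and then α-filtered once the rear is fully visible. The class layout and the default α value are assumptions for illustration, not the patent's code.

```python
class ObjectBox:
    """3D rectangular box tracking a vehicle's length, width and height."""

    def __init__(self, OHm, length=4.0, width=1.5, alpha=0.3):
        self.l, self.w, self.h = length, width, OHm   # defaults on FOV entry
        self.alpha = alpha                            # filtering rate (assumed)

    def update_side(self, L1, L2, OHm):
        self.l = max(L2 - L1, self.l)                 # Ol(k) = max(L2-L1, Ol(k-1))
        self.h = max(OHm, self.h)                     # Oh(k) = max(OHm, Oh(k-1))

    def update_rear(self, L3, L4):
        self.w = max(L4 - L3, self.w)                 # Ow(k) = max(L4-L3, Ow(k-1))

    def refine_width(self, L3, L4):
        a = self.alpha                                # Olm(k) EMA once rear is full
        self.w = a * (L4 - L3) + (1 - a) * self.w
```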

The segmentation algorithm 800, based on a 3D bounding box for selection of the relevant measures, is illustrated in FIG. 29. The first three steps are identical to those of FIG. 17. If step 120 finds horizontal lines, step 120 is followed by step 121. As explained above, the points of a segment are clockwise angle sorted, with L2 the smallest angle and L1 the largest angle; the segment length is given by L2−L1. Otherwise, the next step 123 initializes the 3D bounding box with a default vehicle length. Step 121 is followed by step 122, which considers that two segments have a common corner if there is a point of intersection Pi between the two segments with |Pi−L1| and |Pi−L4| less than a distance threshold. If no corner is found, step 123 initializes the 3D bounding box with default values. Otherwise, step 124 computes the 3D bounding box dimensions from the equations presented above with respect to FIG. 27C.
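Step 122's corner test translates directly into code (a hypothetical helper; Pi would come from intersecting the fitted horizontal and vertical segments):

```python
import numpy as np

def has_common_corner(P_i, L1, L4, dist_threshold):
    """Two segments share a corner if their intersection point P_i lies
    within dist_threshold of both endpoints L1 and L4 (step 122)."""
    P_i, L1, L4 = (np.asarray(p, float) for p in (P_i, L1, L4))
    return (np.linalg.norm(P_i - L1) < dist_threshold and
            np.linalg.norm(P_i - L4) < dist_threshold)
```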

It is of interest to derive minimum-variance bounds on the estimation errors, both to know the best speed-measurement accuracy that can be expected and to assess the quality of the results of the proposed algorithms against those bounds. In time-invariant statistical models, a commonly used lower bound is the Cramer-Rao Lower Bound (CRLB), given by the inverse of the Fisher information matrix. Its sequential extension, the Posterior Cramer-Rao Bound (PCRB), can be used for estimating the kinematic characteristics of the target.

A simulation was done according to the scenario shown in FIG. 30. The vehicle 130 is moving at a speed of 60 m/s along a straight line in lane 3. The PCRB was applied. As shown in FIG. 31, the tracking algorithm converges at point 903 to a speed standard deviation σKF of about 0.48 km/h after 80 samples. From point 900, it is apparent that after 16 samples σKF < 3 km/h; from point 901, that after 28 samples σKF < 1.5 km/h; and from point 902, that after 39 samples σKF < 1 km/h. Experimental tests confirmed the utility and viability of this approach.

Image Processing and Applications

The multipurpose traffic detection system uses a high-resolution image sensor, or more than one image sensor with lower resolution. In the latter case, the control and processing unit has to perform image stitching, combining multiple images with different FOVs and some overlapping sections in order to produce a single high-resolution image. Normally, during the calibration process, the system can determine the exact overlaps between image sensors and produce seamless results by controlling and synchronizing the integration time of each image sensor and the illumination timing, and by analyzing the overlap sections. Infrared and color image sensors can be used with optical filters.

At night, a visible-light source is required to enhance the color of the image. A NIR flash is not visible to the human eye and does not blind drivers, so it can be used at any time of the day and night.

Image sensors can use electronic shutters (global or rolling) or mechanical shutters. In the case of rolling shutters, compensation for the distortion of fast-moving objects (skew effect) can be applied based on the information about the position and the speed of the vehicle. Other controls of the image sensor, like gamma and gain control, can be used to improve the quality of the image in different illumination contexts.

FIG. 32A is a photograph showing an example snapshot taken by a 5 Mpixels image sensor during the day. Vehicles are at a distance of approximately 25 m and the FOV at that distance covers approximately 9 m (almost equivalent to 3 lanes). FIGS. 32B, 32C and 32D show the quality of the image and resolution of FIG. 32A by zooming in on the three license plates.

FIG. 33A is a photograph showing an example snapshot taken by the image sensor at night without any light; the image is completely dark. FIG. 33B shows the same scene with infrared lighting. Two vehicles can be seen, but the license plates are not readable even when zooming in, as seen in FIG. 33C: the license plate acts as a retro-reflector and saturates the image sensor. FIGS. 34A and 34B use the same lighting with a lower integration time. The vehicle is less clear, but the image shows parts of the license plate becoming less saturated. FIGS. 34C and 34D further decrease the integration time and produce a readable license plate.

One way to get a readable license plate at night together with an image of the vehicle is to process several snapshots with different integration times (Ti). For example, when the 3D detection confirms the position of a vehicle in the detection zone, an acquisition sequence of several snapshots (e.g., 4 snapshots with Ti1=50 μs, Ti2=100 μs, Ti3=250 μs and Ti4=500 μs), each snapshot taken at a certain frame rate (e.g., every 50 ms), permits gathering the information on a specific vehicle: information from the 3D sensor, a readable license plate of the tracked vehicle and an image of the context including the photo of the vehicle. If the system captures 4 images during 150 ms, a vehicle at 150 km/h travels 6.25 m in that time (one snapshot roughly every 1.5 m).
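A quick check of the arithmetic in this example (values taken from the paragraph above):

```python
# 4 snapshots, one every 50 ms, at 150 km/h.
speed_ms = 150 / 3.6                         # 150 km/h -> 41.7 m/s
integration_times_us = [50, 100, 250, 500]   # Ti1..Ti4
window_s = 0.150                             # total capture window
travel_m = speed_ms * window_s               # 6.25 m travelled in 150 ms
per_snapshot_m = travel_m / len(integration_times_us)
print(travel_m, per_snapshot_m)              # 6.25 1.5625 (~1.5 m per snapshot)
```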

To enhance the quality of the image, high-dynamic-range (HDR) imaging techniques can be used to improve the dynamic range between the lightest and darkest areas of an image. HDR notably compensates for the loss of information in saturated sections by taking multiple pictures at different integration times and using a stitching process to produce a better-quality image.

The system can use Automatic License Plate Recognition (ALPR), based on Optical Character Recognition (OCR) technology, to identify vehicle license plates. This vehicle identification and measurement information is digitally transmitted to the external controller, or over the network to back-office servers, which process the information and can generate traffic violation alerts.

The multipurpose traffic detection system can be used day or night, in good or bad weather conditions, and also offers the possibility of providing weather information such as the presence of fog or snowing conditions. Fog and snow have an impact on the reflection of the radiated light pulses off the protective window. In the presence of fog, the peak amplitude of the first pulse exhibits sizable time fluctuations, by a factor that may reach 2 to 3 when compared to its mean peak amplitude level. Likewise, the width of the first pulse also shows time fluctuations during these adverse weather conditions, but with a reduced factor, for example by about 10 to 50%. During snowfalls, the peak amplitude of the first pulse visible in the waveforms generally shows faster time fluctuations, while the fluctuations of the pulse width are less intense. Finally, a long-lasting change in the peak amplitude of the first pulse can simply be due to dirt or snow deposited on the exterior surface of the protective window.
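As an illustration only, a detector of this kind could threshold the fluctuation of the first pulse's peak amplitude over a window of frames; the ratio test and the threshold below are assumptions based on the factor of 2 to 3 quoted above, not the patent's algorithm.

```python
import numpy as np

def fog_suspected(first_pulse_amplitudes, factor=2.0):
    """Flag fog when the peak amplitude of the window-reflection pulse
    fluctuates by more than `factor` relative to its mean level."""
    a = np.asarray(first_pulse_amplitudes, float)
    return a.max() / max(a.mean(), 1e-9) >= factor
```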

FIG. 35 shows an example image taken with infrared illumination with the overlay (dashed lines) representing the perimeter of the 16 contiguous detection zones of the 3DOR. Apparent on FIG. 35 are high-intensity spots 140 coming from sections of the vehicle having a high retro-reflectivity characteristic. Such sections include the license plate, retro-reflectors installed on the car and lighting modules that can include retro-reflectors. An object with a retro-reflectivity characteristic reflects light back to its source with minimum scattering. The return signal can be as much as 100 times stronger than a signal coming from a surface with Lambertian reflectance. This retro-reflectivity characteristic has the same kind of impact on the 3DOR: each 3D channel detecting a retro-reflector at a certain distance in its FOV will acquire a waveform with a high peak amplitude at the distance of the retro-reflector. The numbers at the bottom of the overlay (in dashed lines) represent the distance measured by the multipurpose traffic detection system in each channel which contains a high peak in its waveform. Then, with a good image registration between the 2D image sensor and the 3D sensor, the 2D information (spots with high intensity) can be correlated with the 3D information (high amplitude at a certain distance). This link between 2D images and 3D detections ensures a match between the identification data based on reading license plates and the measurements of position and velocity from the 3D sensor.
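A minimal sketch of the 2D/3D correlation described above, assuming the image-registration calibration yields, for each 3D channel, a rectangular region of interest in the image; the data layout is hypothetical.

```python
def match_spots_to_channels(spots, channel_peaks, channel_roi):
    """spots: list of (u, v) high-intensity image points;
    channel_peaks: dict channel -> range (m) of a high-amplitude peak, or None;
    channel_roi: dict channel -> (u0, v0, u1, v1) image rectangle from registration."""
    matches = []
    for (u, v) in spots:
        for ch, (u0, v0, u1, v1) in channel_roi.items():
            if u0 <= u <= u1 and v0 <= v <= v1 and channel_peaks.get(ch):
                # 2D retro-reflective spot confirmed by a 3D range reading
                matches.append(((u, v), ch, channel_peaks[ch]))
    return matches
```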

The license plate identification process can also be used as a second, alternative way to determine the speed of the vehicle, with lower accuracy but useful as a validation or confirmation. By analyzing the size of the license plate and/or of its characters on successive images, the progression of the vehicle in the detection zone can be estimated and used to confirm the measured displacement.
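Under a simple pinhole-camera assumption (apparent plate size inversely proportional to distance), the approximate speed check could look like the following sketch; the focal length and plate width are illustrative placeholders, not values from the patent.

```python
def plate_distance(focal_px, plate_width_m, plate_width_px):
    """Pinhole model: distance = f * real_width / apparent_width."""
    return focal_px * plate_width_m / plate_width_px

def approx_speed(size_px_1, size_px_2, dt, focal_px=2000.0, plate_w=0.52):
    """Approximate speed (m/s) from the growth of the plate between two
    frames dt seconds apart; positive when the vehicle approaches."""
    d1 = plate_distance(focal_px, plate_w, size_px_1)
    d2 = plate_distance(focal_px, plate_w, size_px_2)
    return (d1 - d2) / dt
```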

The embodiments described above are intended to be exemplary only. The scope of the invention is therefore intended to be limited solely by the appended claims.

Inventors: Mimeault, Yvan; Gidel, Samuel
