A method for determining an object in the surroundings of a motor vehicle includes: scanning a far range, which begins at a predetermined minimum distance from a radar sensor, using the radar sensor; detecting objects in the far range based on reflections of a radar signal emitted by the radar sensor; and determining a crossing object in a close range, which lies between the radar sensor and the far range, if a previously detected object is no longer able to be detected in the far range using the radar sensor.
1. A method for determining an object in a surroundings of a motor vehicle, comprising:
scanning a far range of the surroundings using a radar sensor for scanning the far range, wherein the far range begins at a predetermined minimum distance from the radar sensor;
detecting, for each of a plurality of time periods, objects in the far range based on reflections of a radar signal emitted by the radar sensor;
comparing, for at least one of the plurality of time periods, to at least one corresponding threshold value, at least one of: a proportion of a number of the detected objects in the far range in at least one previous time period still detected in the at least one time period, a sum over the at least one time period and a predetermined number of previous time periods of a number of objects no longer able to be detected in the far range relative to a number of objects detected in the far range for at least one previous time period, or a change in the number of the detected objects in the far range relative to a number of detected objects in the far range for at least one previous time period; and
determining a presence of a crossing object in a close range, which lies between the radar sensor and the far range, as a function of the comparing.
13. A device for determining an object in a surroundings of a motor vehicle, comprising:
a radar sensor for scanning a far range of the surroundings, wherein the far range begins at a predetermined minimum distance from the radar sensor; and
a processing device configured to:
detect, for each of a plurality of time periods, objects in the far range based on reflections of a radar signal emitted by the radar sensor;
compare, for at least one of the plurality of time periods, to at least one corresponding threshold value, at least one of: a proportion of a number of the detected objects in the far range in at least one previous time period still detected in the at least one time period, a sum over the at least one time period and a predetermined number of previous time periods of a number of objects no longer able to be detected in the far range relative to a number of detected objects in the far range for at least one previous time period, or a change in the number of the detected objects in the far range relative to a number of detected objects in the far range for at least one previous time period; and
determine a presence of a crossing object in a close range, which lies between the radar sensor and the far range, as a function of the comparing.
12. A non-transitory, machine-readable medium storing program instructions, which when executed by a processor cause the processor to perform a method for determining an object in a surroundings of a motor vehicle, the method comprising:
scanning a far range of the surroundings using a radar sensor for scanning the far range, wherein the far range begins at a predetermined minimum distance from the radar sensor;
detecting, for each of a plurality of time periods, objects in the far range based on reflections of a radar signal emitted by the radar sensor;
comparing, for at least one of the plurality of time periods, to at least one corresponding threshold value, at least one of: a proportion of a number of the detected objects in the far range in at least one previous time period still detected in the at least one time period, a sum over the at least one time period and a predetermined number of previous time periods of a number of objects no longer able to be detected in the far range relative to a number of detected objects in the far range for at least one previous time period, or a change in the number of the detected objects in the far range relative to a number of detected objects in the far range for at least one previous time period; and
determining a presence of a crossing object in a close range, which lies between the radar sensor and the far range, as a function of the comparing.
2. The method as recited in
3. The method as recited in
4. The method as recited in
5. The method as recited in
6. The method as recited in
7. The method as recited in
8. The method as recited in
9. The method as recited in
10. The method as recited in
11. The method as recited in
14. The device as recited in
15. The device as recited in
The present invention generally relates to a technique for determining an object using a radar sensor, and particularly relates to determining an object crossing in the close range ahead of a vehicle.
A motor vehicle has a radar sensor for detecting one or more objects. For this purpose, the radar sensor emits a radar signal, which may be reflected at an object back to the radar sensor. The reflected signal arriving at the radar sensor is correlated with the emitted signal so that the object can be detected. In this context, a distance and/or a speed of the object relative to the radar sensor may also be determined. Such a system may be used in particular as part of a driver assistance system which, for example, is intended to maintain a predetermined distance between the motor vehicle and a preceding vehicle. In another example, the system may be part of an assistance system for autonomous or partially autonomous guidance of the motor vehicle.
The radar sensors used are usually not suitable for determining an object in a close range that extends only a few meters from the radar sensor. To detect an object in the close range, a different sensor is normally used, for example an ultrasonic sensor or a video camera. Nevertheless, monitoring the close range, especially for a movable object crossing the direction of motion of the motor vehicle, may be desirable. For instance, the engine driver of a locomotive frequently cannot see the region lying directly in front of the locomotive. Starting the locomotive while a person is crossing the track cannot be prevented using a conventional radar sensor. A drive-off assistant for a motor vehicle, for instance in stop-and-go traffic, likewise has to scan a close range in order to determine whether, for example, a preceding motor vehicle has left the route.
It is therefore an object of the present invention to provide a method, a computer program product and a device for determining an object in a close range in front of a motor vehicle that function on the basis of a conventional radar sensor.
A motor vehicle includes a radar sensor. The surroundings of the motor vehicle are subdivided into a far range, which begins at a predetermined minimum distance from the radar sensor, and a close range, which lies between the radar sensor and the far range. A method for determining an object crossing in the close range includes: scanning the far range using the radar sensor; detecting objects in the far range based on reflections of a radar signal emitted by the radar sensor; and determining the crossing object in the close range when a previously detected object in the far range is no longer able to be detected using the radar sensor.
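As a minimal sketch, the core inference of the method can be expressed as follows; the representation of detected far-range objects as sets of identifiers is an illustrative assumption, not a feature of the described signal processing:

```python
def crossing_object_present(previous_ids, current_ids):
    """Infer a crossing object in the close range: at least one far-range
    object detected in the previous scan is no longer detectable."""
    return bool(previous_ids - current_ids)

# Objects shadowed by a person entering the close range disappear from
# the far-range detections, so a crossing object is inferred.
assert crossing_object_present({140, 145, 150, 155}, {155})
assert not crossing_object_present({155, 160}, {155, 160})
```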
The object is thus determined on the basis of its shadowing of other objects, even when the object itself cannot be detected at all, for instance with respect to its extension, its speed or the direction of its motion. By inferring an object in the close range from the detectability of objects in the far range, a conventional radar sensor can be used below its physical or product-specific minimum range. The region of the surroundings of the motor vehicle in which an object can be determined may thus be enlarged without the aid of additional sensors. The object can be determined in particular if it is moving transversely or obliquely to an output direction of the radar signal. The output direction may coincide with a direction of motion of the motor vehicle. The object may thus be detected whether the motor vehicle is moving or stationary.
Advantageously, the object in the close range may be determined based on a single measurement. A plausibility check over several successive measurements, as is usually required for detecting distant objects, may be omitted. The region near the motor vehicle that is particularly at risk of an accident may thus be checked very quickly for crossing objects.
In one preferred embodiment, only those objects in the far range are detected that are evaluated as relevant for the motor vehicle. The relevance of each object in the far range may be derived in particular from a measurement quality, a state of motion of the object or its position with respect to the motor vehicle. By rejecting certain objects as irrelevant, the determination of whether an object can be detected using the radar sensor is improved. The spatial resolution of the method may thus be increased.
In one embodiment, the object in the close range is determined if the number of objects that are no longer detectable in the far range changes at a predetermined rate. This corresponds to evaluating a differential component of the curve of the number of objects in the far range. A slow change in this number, as may occur during cornering, for example, therefore cannot lead to an erroneous determination of an object in the close range. The quality of the determination may thus be increased.
The object in the close range may also be determined if a predetermined proportion of the previously detected objects can no longer be detected. A statistical evaluation of the objects in the far range may thus take place, from which conclusions about the object in the close range can be drawn. The spatial distribution of the objects in the far range may be disregarded in this case. The determination of the object in the close range may thus be further improved.
The object in the close range may also be determined if the sum of the numbers of objects that are no longer detectable exceeds a predetermined threshold value over a predetermined number of past scans. This corresponds to evaluating an integral component of the number of detectable objects. The object in the close range may thus be determined particularly early.
A combination of the differential component, the integral component and the relative number of detectable objects may also be evaluated in order to determine the object in the close range as quickly and as selectively as possible.
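The three indicators described above can be sketched as follows. This is a non-authoritative illustration: the window length and all threshold values are placeholders, and the history of detections as a list of identifier sets is an assumption:

```python
def crossing_indicators(history, window=8,
                        diff_threshold=2, integral_threshold=3,
                        proportion_threshold=0.5):
    """history: list of sets of detected far-range object IDs, oldest first.
    Returns the differential, integral and relative-proportion tests."""
    prev, curr = history[-2], history[-1]

    # Differential component: change in the number of detected objects
    # relative to the preceding scan.
    differential = abs(len(curr) - len(prev)) >= diff_threshold

    # Integral component: objects no longer detectable, summed over the
    # last `window` scans.
    lost = sum(len(history[i] - history[i + 1])
               for i in range(max(0, len(history) - window - 1),
                              len(history) - 1))
    integral = lost >= integral_threshold

    # Relative proportion: share of previously detected objects still seen.
    still = len(prev & curr) / len(prev) if prev else 1.0
    proportion = still <= proportion_threshold

    return differential, integral, proportion
```

Any one test, or a situational combination of them, could then trigger the determination of an object in the close range.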
Furthermore, a travel situation of the motor vehicle may be determined, and the object in the close range may be determined with the aid of the travel situation. This corresponds to a situational evaluation, which may already be carried out for other reasons when scanning the surroundings of the motor vehicle. In particular, heuristic and empirical values, which may be derived from the travel situation, may enter into the determination of the object.
A computer program product according to the present invention includes program code means for carrying out the described method when the computer program product is run on a processing device or is stored on a computer-readable data carrier.
A device according to the present invention for determining an object in the surroundings of a motor vehicle includes a radar sensor for scanning a far range, which begins at a predetermined minimum distance from the radar sensor, and a processing device for detecting objects in the far range based on reflections of a radar signal output by the radar sensor. The processing device is configured to determine a crossing object in a close range, which lies between the radar sensor and the far range, when a previously detected object in the far range is no longer able to be detected using the radar sensor.
In this context, the radar sensor preferably includes a frequency-modulated continuous-wave (FMCW) radar. Distances and relative speeds of distant objects can thereby be determined quickly and with high resolution.
The radar sensor is preferably designed to scan objects in the far range, the far range beginning at a minimum distance of approximately 4 m from the radar sensor. Reflections from objects closer than the far range are usually filtered out during signal processing.
Radar sensor 115 is designed to scan surroundings 130 of the motor vehicle. For this purpose, radar sensor 115 outputs a radar signal along a longitudinal axis 135, which preferably coincides with a direction of motion of motor vehicle 105. In surroundings 130 of motor vehicle 105 there may be objects which reflect the radar signal back to radar sensor 115. In a far range 170, which begins at a predetermined minimum distance of approximately 4 m from radar sensor 115, there are six exemplary objects 140 to 165. In a close range 175, which lies between far range 170 and radar sensor 115, there is an additional object 180, represented by way of example as a person.
A distinction is generally made between an object 180 that is merely measurable and one that can be tracked. To measure object 180, a reflected signal of object 180 first has to be recorded. By evaluating a plurality of measurements taken at different times, object 180 may then also be tracked. Conventional systems work only with objects 180 that can be tracked. If an object 180 is located in the close range, measuring it may be possible, but tracking it is not. For example, the available measuring period may be too short to carry out a sufficient number of measurements of object 180. The measuring period is determined by the speed at which object 180 is moving relative to radar sensor 115 and by the speed of radar sensor 115 along the path of motion of object 180. It must be ensured that object 180 is detected in time so that a measure for avoiding a collision can still be carried out successfully. In the surroundings of a motor vehicle 105 whose collision with a crossing pedestrian, for example, is to be prevented, the far range, in which tracking is possible, usually begins at a distance of approximately 4 m from radar sensor 115.
The detection range of radar sensor 115 may be restricted to objects lying within a predetermined circular segment that includes longitudinal axis 135. Sixth object 165 lies outside the detection range and may remain undetected by radar sensor 115. A further restriction on the detectability of objects may arise from their distance from radar sensor 115. A reflected radar signal from object 180 in close range 175 may not be evaluable by measuring techniques, or may be rejected for other reasons during processing, so that object 180 itself cannot be detected using radar sensor 115, for instance with regard to its position or speed.
In order to determine that object 180 is crossing in close range 175, that is, that it is moving transversely or at an acute angle to longitudinal axis 135, the determination of objects 145 to 160 may take place cyclically, and detected objects 145 to 160 may be tracked, for example by storing specific data on these objects 145 to 160 in memory 125. If object 180 moves laterally into close range 175, objects 140 to 150, which, as seen from radar sensor 115, lie behind object 180, can no longer be detected, although they are in far range 170. Based on the shadowing of objects 140 to 150, processing device 120 can conclude that object 180 is present. A metrological detection of object 180 by radar sensor 115 is therefore not required.
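The complementary check, whether a vanished object is merely shadowed by another detected far-range object rather than by an object in close range 175, can be sketched geometrically. The sensor at the origin and the simple bearing comparison below are illustrative assumptions standing in for the actual signal processing:

```python
import math

def is_shadowed_by(vanished_xy, occluder_xy, angular_margin_deg=5.0):
    """True if the vanished object lies behind the occluder as seen from
    the radar sensor at the origin: similar bearing, greater distance."""
    bearing_vanished = math.degrees(math.atan2(vanished_xy[1], vanished_xy[0]))
    bearing_occluder = math.degrees(math.atan2(occluder_xy[1], occluder_xy[0]))
    farther = math.hypot(*vanished_xy) > math.hypot(*occluder_xy)
    return farther and abs(bearing_vanished - bearing_occluder) <= angular_margin_deg

# A far-range object 20 m ahead vanishing behind a detected object 10 m
# ahead on almost the same bearing is explained by far-range shadowing;
# only an unexplained disappearance would point to a close-range object.
assert is_shadowed_by((20.0, 0.0), (10.0, 0.5))
assert not is_shadowed_by((20.0, 15.0), (10.0, 0.5))
```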
Preferably at regular intervals, objects 140 to 160 in far range 170 are detected in step 215 based on a correlation of the output radar signals with the reflected radar signals. The detection may include providing a plurality of data on the respective object 140 to 160. For example, a measuring time, the amplitude of the reflected signal, a distance, an extension, a speed, a position of the respective object with respect to radar sensor 115 or additional data may be determined. This process may take place for each of objects 140 to 160, and a corresponding representation of each object 140 to 160 may be stored in memory 125.
In an optional step 220, which may be integrated with step 215, it is checked whether one of objects 140 to 160 is irrelevant. The irrelevance of an object 140 to 160 may result, for instance, from its direction and speed of motion, its position with respect to motor vehicle 105 or a measurement quality derived from the signal strength of the reflected signal. For objects 140 to 160 that are determined to be irrelevant, step 225 is carried out, in which these objects 140 to 160 are rejected. All other objects 140 to 160 are unaffected.
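The rejection of irrelevant objects can be sketched as a simple filter. The field names and threshold values below are assumptions for illustration only:

```python
def filter_relevant(objects, min_quality=0.3, max_lateral_offset=5.0):
    """Reject far-range detections judged irrelevant, e.g. weak
    reflections or objects far off the vehicle's path."""
    return [o for o in objects
            if o["quality"] >= min_quality
            and abs(o["lateral_offset"]) <= max_lateral_offset]

objs = [
    {"id": 140, "quality": 0.9, "lateral_offset": 0.5},
    {"id": 165, "quality": 0.8, "lateral_offset": 12.0},  # far off the path
    {"id": 160, "quality": 0.1, "lateral_offset": 1.0},   # weak reflection
]
assert [o["id"] for o in filter_relevant(objs)] == [140]
```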
In a step 230 it is checked whether all objects 140 to 160 that were detected in a preceding pass of method 200, in particular of step 215, have again been detected in the current pass. If this is the case, method 200 may branch back to step 205 and run through again. Otherwise it may be determined, optionally in a step 235, whether the shadowing of one of objects 140 to 160 has taken place due to another object 140 to 160 in the far range. In the illustration of
Otherwise, a travel situation of motor vehicle 105 may be determined in an optional step 240. A determination of object 180 in close range 175, carried out in a subsequent step 245, may then take place based on the travel situation. For this purpose, the covering of objects 140 to 160 in individual segments of the detection range of radar sensor 115, in particular a directional angle with respect to longitudinal axis 135 and the distance from radar sensor 115, may be taken into account. In addition, situational characteristic variables, such as the number and spatial distribution of objects 140 to 160, an average distance or a measurement quality of these objects 140 to 160, may be considered. Parameters of the motion of motor vehicle 105, in particular a direction of motion, a speed and an acceleration, may be taken into account. One or more of these indicators may be weighted as a function of the travel situation and compared with one another. The position of object 180 in close range 175 may thereby be determined more accurately.
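The situation-dependent weighting of several indicators can be illustrated by a simple linear fusion. The weights and the resulting score are placeholders, not values from the description:

```python
def fused_score(indicators, weights):
    """Combine boolean indicator results into one weighted score."""
    return sum(w * float(hit) for hit, w in zip(indicators, weights))

# E.g. in stop-and-go traffic, the relative-proportion indicator might be
# weighted highest; a score above a situational threshold would then
# indicate an object in the close range.
score = fused_score((True, False, True), (0.3, 0.2, 0.5))
assert abs(score - 0.8) < 1e-9
```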
In step 245, object 180 may also be determined, additionally or alternatively, based on a statistical evaluation.
A first curve 305 relates to the number of objects 140 to 160 that are detectable by radar sensor 115 in far range 170. At the beginning this number is 5; it then drops to 2 in the range of measuring periods T4 and T5 and subsequently rises again to 5. A second curve 310 relates to an integral component of first curve 305: each deviation of curve 305 from the number of objects 140 to 160 in measuring period T0 is summed up. The integration time, that is, the number of previous measuring periods T0 to T9 taken into account for the integration, is at least 8 in the present illustration, so that no drop in curve 310 can be observed as long as the number of objects 140 to 160 in first curve 305 does not change.
A third curve 315 relates to a differential component of first curve 305. In each measuring period T0 to T9, third curve 315 has a value which expresses by how much the value of first curve 305 in that measuring period differs from its value in the preceding measuring period. A high value of third curve 315 indicates a rapid change in the number of detectable objects 140 to 160. A fourth curve 320 is reproduced numerically and indicates a relative proportion of detectable objects 140 to 160: for each measuring period T0 to T9, it indicates how many of objects 140 to 160 are shadowed and how many are detectable without shadowing, the two values being separated by a slash.
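The integral and differential curves can be reproduced numerically. The count sequence below is an assumption that follows the example values given for curve 305 (5 objects, dropping to 2 around T4 and T5, then back to 5):

```python
# First curve 305: number of detectable far-range objects per measuring
# period T0..T9, following the example values in the description.
counts = [5, 5, 5, 5, 2, 2, 5, 5, 5, 5]

# Third curve 315: differential component -- change versus the preceding
# measuring period.
differential = [0] + [counts[i] - counts[i - 1]
                      for i in range(1, len(counts))]

# Second curve 310: integral component -- deviations from the T0 count,
# summed over an integration window of at least 8 previous periods.
WINDOW = 8
integral = [sum(counts[0] - c for c in counts[max(0, i - WINDOW):i + 1])
            for i in range(len(counts))]
```

For example, the differential component spikes at T4 (objects disappear) and T6 (they reappear), while the integral component remains elevated over the whole window.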
Within the scope of a statistical evaluation, which in particular is able to be carried out in step 245 of method 200 of
Steinmetz, Alexander, van Uffelen, Raphael
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jul 31 2014 | Robert Bosch GmbH | (assignment on the face of the patent) | / | |||
Sep 08 2014 | UFFELEN, RAPHAEL VAN | Robert Bosch GmbH | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 034672 | /0717 | |
Sep 23 2014 | STEINMETZ, ALEXANDER | Robert Bosch GmbH | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 034672 | /0717 |