A method for sensing a traffic environment for use in an electronic device is provided. The method includes: generating local object information by sensing an environment within a first sensing range of the electronic device, wherein the local object information at least includes first geographical distribution information of local objects within the first sensing range; receiving external object information transmitted by at least one node, wherein the external object information includes at least second geographical distribution information of external objects within a second sensing range of the node; and generating object integration information according to the local object information and the external object information.
1. A method for sensing a traffic environment, used in an electronic device, comprising:
generating, by a sensor of the electronic device, local object information by sensing an environment within a first sensing range of the electronic device, wherein the local object information at least comprises first geographical distribution information of local objects within the first sensing range, wherein the first sensing range is a range centered on the electronic device;
receiving external object information transmitted by at least one node, wherein the external object information comprises at least second geographical distribution information of external objects within a second sensing range of the node; and
generating object integration information according to the local object information and the external object information,
wherein the local object information comprises first absolute position data of the electronic device, and the external object information comprises second absolute position data of the node,
wherein the step of generating object integration information according to the local object information and the external object information comprises:
obtaining absolute position data of the local objects and absolute position data of the external objects according to the first absolute position data, the second absolute position data, the first geographical distribution information and the second geographical distribution information;
determining whether the absolute position data of the local objects are the same as the absolute position data of the external objects; and
integrating the local object information and the external object information to generate object integration information when the absolute position data of the local objects are not the same as the absolute position data of the external objects.
10. A device for sensing a traffic environment, comprising:
one or more processors; and
one or more computer storage media for storing one or more computer-readable instructions, wherein the one or more processors are configured to execute the computer-readable instructions stored in the computer storage media to perform the following tasks:
generating, by a sensor of the device, local object information by sensing an environment within a first sensing range of the device, wherein the local object information at least includes first geographical distribution information of local objects within the first sensing range, wherein the first sensing range is a range centered on the device;
receiving external object information transmitted by at least one node, wherein the external object information includes at least second geographical distribution information of external objects within a second sensing range of the node; and
generating object integration information according to the local object information and the external object information,
wherein the local object information comprises first absolute position data of the device, and the external object information comprises second absolute position data of the node,
wherein generating object integration information according to the local object information and the external object information by the processor comprises:
obtaining absolute position data of the local objects and absolute position data of the external objects according to the first absolute position data, the second absolute position data, the first geographical distribution information and the second geographical distribution information;
determining whether the absolute position data of the local objects are the same as the absolute position data of the external objects; and
integrating the local object information and the external object information to generate object integration information when the absolute position data of the local objects are not the same as the absolute position data of the external objects.
2. The method for sensing a traffic environment as claimed in claim 1, wherein the local object information further comprises an identifier of the electronic device, and the external object information further comprises an identifier of the node.
3. The method for sensing a traffic environment as claimed in claim 1, wherein the first geographical distribution information comprises relative position data of the local objects relative to the electronic device, and the second geographical distribution information comprises relative position data of the external objects relative to the node.
4. The method for sensing a traffic environment as claimed in claim 1, wherein the local object information further comprises a local timestamp, and the external object information further comprises an external timestamp.
5. The method for sensing a traffic environment as claimed in claim 4, wherein the step of generating object integration information according to the local object information and the external object information further comprises:
determining whether a difference between the local timestamp and the external timestamp is greater than an update period; and
deleting the external object information when the difference is greater than the update period.
6. The method for sensing a traffic environment as claimed in claim 5, wherein the update period is the time interval at which the electronic device re-generates the local object information.
7. The method for sensing a traffic environment as claimed in claim 1, wherein the electronic device is a vehicle device.
8. The method for sensing a traffic environment as claimed in claim 1, wherein the electronic device is a road side unit (RSU), and the method further comprises:
broadcasting the object integration information.
9. The method for sensing a traffic environment as claimed in claim 1, wherein the node is a road side unit (RSU) or a vehicle device.
11. The device for sensing a traffic environment as claimed in claim 10, wherein the local object information further comprises an identifier of the device, and the external object information further comprises an identifier of the node.
12. The device for sensing a traffic environment as claimed in claim 10, wherein the first geographical distribution information comprises relative position data of the local objects relative to the device, and the second geographical distribution information comprises relative position data of the external objects relative to the node.
13. The device for sensing a traffic environment as claimed in claim 10, wherein the local object information further comprises a local timestamp, and the external object information further comprises an external timestamp.
14. The device for sensing a traffic environment as claimed in claim 13, wherein generating object integration information according to the local object information and the external object information by the processor further comprises:
determining whether a difference between the local timestamp and the external timestamp is greater than an update period; and
deleting the external object information when the difference is greater than the update period.
15. The device for sensing a traffic environment as claimed in claim 14, wherein the update period is the time interval at which the device re-generates the local object information.
16. The device for sensing a traffic environment as claimed in claim 10, wherein the device is a vehicle device.
17. The device for sensing a traffic environment as claimed in claim 10, wherein the device is a road side unit (RSU), and the tasks further comprise:
broadcasting the object integration information.
18. The device for sensing a traffic environment as claimed in claim 10, wherein the node is a road side unit (RSU) or a vehicle device.
The present application claims priority from U.S. Provisional Application Ser. No. 62/770,369, filed on Nov. 21, 2018 in the United States Patent and Trademark Office, and from Taiwan Patent Application No. 108116665, filed on May 15, 2019, the entireties of which are incorporated herein by reference.
The disclosure relates to a method and a device for sensing the traffic environment. Specifically, the present disclosure relates to a method and a device that sense the traffic environment using Road Side Units (RSUs).
How to improve driving safety has always been of interest to the automobile industry. Many manufacturers have developed video cameras, radar imaging, LIDAR, and ultrasonic sensors to detect obstacles around a vehicle to inform drivers of road conditions.
However, a camera or radar mounted on a vehicle can generally only monitor an area in one or a few directions. When the vehicle is turning, the camera may fail to capture the locations of other vehicles in its blind spots, and the radar cannot obtain the locations of vehicles hidden behind obstacles. Such blind areas, which the camera or radar cannot perceive, may pose a threat to the vehicle or a risk of collision, thereby reducing the safety of the vehicle.
Thus, a method and a device for sensing the traffic environment are desired to minimize the disadvantages and improve driving safety.
In an exemplary embodiment, a method for sensing the traffic environment, for use in an electronic device, is provided in the disclosure. The method comprises: generating local object information by sensing an environment within a first sensing range of the electronic device, wherein the local object information at least comprises first geographical distribution information of the local objects within the first sensing range; receiving external object information transmitted by at least one node, wherein the external object information comprises at least second geographical distribution information of external objects within a second sensing range of the node; and generating object integration information according to the local object information and the external object information.
In some exemplary embodiments, the local object information further comprises an identifier of the electronic device and first absolute position data of the electronic device, and the external object information further comprises an identifier of the node and second absolute position data of the node.
In some exemplary embodiments, the first geographical distribution information comprises relative position data of the local objects relative to the electronic device, and the second geographical distribution information comprises relative position data of the external objects relative to the node.
In some exemplary embodiments, the step of generating object integration information according to the local object information and the external object information comprises: obtaining absolute position data of the local objects and absolute position data of the external objects according to the first absolute position data, the second absolute position data, the first geographical distribution information and the second geographical distribution information; determining whether the absolute position data of the local objects are the same as the absolute position data of the external objects; and integrating the local object information and the external object information to generate object integration information when the absolute position data of the local objects are not the same as the absolute position data of the external objects.
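Sketched below, purely for illustration, is the first sub-step of this integration: converting relative offsets into absolute positions. The function name and the planar (x, y) coordinate model are assumptions, not taken from the disclosure.

```python
# Minimal sketch, assuming planar (x, y) coordinates: an object's absolute
# position is the sensing device's absolute position plus the relative
# offset reported in the geographical distribution information.
def object_absolute_positions(device_position, relative_offsets):
    x0, y0 = device_position
    return [(x0 + dx, y0 + dy) for dx, dy in relative_offsets]

# e.g. a device at (100.0, 200.0) sensing an object 3 m east and 4 m north:
print(object_absolute_positions((100.0, 200.0), [(3.0, 4.0)]))
# [(103.0, 204.0)]
```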
In some exemplary embodiments, the local object information further comprises a local timestamp, and the external object information further comprises an external timestamp.
In some exemplary embodiments, the step of generating object integration information according to the local object information and the external object information comprises: determining whether a difference between the local timestamp and the external timestamp is greater than an update period; and deleting the external object information when the difference is greater than the update period.
In some exemplary embodiments, the update period is the time interval at which the electronic device re-generates the local object information.
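As a hedged illustration of this timestamp check, the following sketch treats timestamps and the update period as seconds; the function name is hypothetical.

```python
def is_stale(local_timestamp, external_timestamp, update_period):
    """External object information is deleted when the timestamp
    difference exceeds one update period."""
    return abs(local_timestamp - external_timestamp) > update_period

# With a 100 ms sensing cycle, external information generated 250 ms away
# from the local scan is discarded:
print(is_stale(10.35, 10.10, 0.1))  # True -> delete the external info
```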
In some exemplary embodiments, the electronic device is a vehicle device.
In some exemplary embodiments, the electronic device is a road side unit (RSU), and the method further comprises broadcasting the object integration information.
In some exemplary embodiments, the node is a road side unit (RSU) or a vehicle device.
In an exemplary embodiment, a device for sensing a traffic environment is provided. The device comprises one or more processors and one or more computer storage media for storing one or more computer-readable instructions. The one or more processors are configured to execute the computer-readable instructions stored in the computer storage media to perform the following tasks: generating local object information by sensing an environment within a first sensing range of the device, wherein the local object information at least includes first geographical distribution information of local objects within the first sensing range; receiving external object information transmitted by at least one node, wherein the external object information includes at least second geographical distribution information of external objects within a second sensing range of the node; and generating object integration information according to the local object information and the external object information.
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of the present disclosure. The drawings illustrate implementations of the disclosure and, together with the description, serve to explain the principles of the disclosure. It should be appreciated that the drawings are not necessarily to scale as some components may be shown out of proportion to the size in actual implementation in order to clearly illustrate the concept of the present disclosure.
Various aspects of the disclosure are described more fully below with reference to the accompanying drawings. This disclosure may, however, be embodied in many different forms and should not be construed as limited to any specific structure or function presented throughout this disclosure. Rather, these aspects are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art. Based on the teachings herein, one skilled in the art should appreciate that the scope of the disclosure is intended to cover any aspect of the disclosure disclosed herein, whether implemented independently of or combined with any other aspect of the disclosure. For example, an apparatus may be implemented or a method may be practiced using any number of the aspects set forth herein. In addition, the scope of the disclosure is intended to cover such an apparatus or method which is practiced using other structure, functionality, or structure and functionality in addition to or other than the various aspects of the disclosure set forth herein. It should be understood that any aspect of the disclosure disclosed herein may be embodied by one or more elements of a claim.
The word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any aspect described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other aspects. Furthermore, like numerals refer to like elements throughout the several views, and the articles “a” and “the” include plural references, unless otherwise specified in the description.
It should be understood that when an element is referred to as being “connected” or “coupled” to another element, it may be directly connected or coupled to the other element or intervening elements may be present. In contrast, when an element is referred to as being “directly connected” or “directly coupled” to another element, there are no intervening elements present. Other words used to describe the relationship between elements should be interpreted in a like fashion. (e.g., “between” versus “directly between”, “adjacent” versus “directly adjacent”, etc.).
Each of the RSUs 110A, 110B and 110C can periodically sense the environment within its own sensing range using a sensor (for example, a camera, a radar, or a light sensor) to generate local object information, and the vehicle device 120 can likewise periodically sense the environment within its own sensing range using a sensor (for example, a camera, a radar, or a light sensor) to generate local object information, as shown in
Taking the RSU 110A as an example, the RSU 110A can sense the environment within the first sensing range 110a of the RSU 110A and generate first local object information, wherein the first local object information comprises an identifier of the RSU 110A and absolute position data, a local timestamp, and first geographical distribution information of the local objects A1, A2, A3 and 130 within the first sensing range 110a. The local timestamp is the time at which the first local object information is generated. As shown in
Taking the RSU 110B as an example, the RSU 110B can sense the environment within the second sensing range 110b of the RSU 110B and generate second local object information, wherein the second local object information comprises an identifier of the RSU 110B and absolute position data, a local timestamp, and second geographical distribution information of the local objects B1, B2, B3 and 130 within the second sensing range 110b. The local timestamp is the time at which the second local object information is generated. As shown in
Taking the RSU 110C as an example, the RSU 110C can sense the environment within the third sensing range 110c of the RSU 110C and generate third local object information, wherein the third local object information comprises an identifier of the RSU 110C and absolute position data, a local timestamp, and third geographical distribution information of the local objects 130, 133 and 134 within the third sensing range 110c. The local timestamp is the time at which the third local object information is generated. As shown in
Taking the vehicle device 120 as an example, the vehicle device 120 can sense the environment within the fourth sensing range 120a of the vehicle device 120 and generate fourth local object information, wherein the fourth local object information comprises an identifier of the vehicle device 120 and absolute position data, a local timestamp, and fourth geographical distribution information of the local objects 131 and 132 within the fourth sensing range 120a. The local timestamp is the time at which the fourth local object information is generated. As shown in
When each device (the RSU 110A, 110B, 110C or the vehicle device 120) generates its own local object information, each device broadcasts the local object information. Illustratively, the respective object information generated by each device (the RSU 110A, 110B, 110C or the vehicle device 120) is called the local object information. The object information received by a device from other devices broadcasting the object information is called the external object information. For example, the RSU 110A generates and broadcasts the first local object information. The RSU 110B receives the first local object information broadcasted by the RSU 110A. For the RSU 110B, the first local object information is regarded as the external object information. The object information generated by the RSU 110B is called the local object information.
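One possible shape for such a broadcast message is sketched below. The JSON layout and field names are assumptions for illustration only; the disclosure requires an identifier, absolute position data, a timestamp, and the geographical distribution of sensed objects, but does not specify an encoding.

```python
import json
import time

def make_object_info(device_id, absolute_position, relative_objects):
    """Serialize local object information for broadcasting."""
    return json.dumps({
        "id": device_id,                # identifier of the RSU or vehicle device
        "position": absolute_position,  # absolute position data of the device
        "timestamp": time.time(),       # time at which the info is generated
        "objects": relative_objects,    # relative positions of sensed objects
    })

# A receiver treats any message whose "id" differs from its own identifier
# as external object information.
msg = make_object_info("RSU-110A", (24.78, 121.01), [(1.5, -2.0), (4.2, 0.3)])
```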
When a device (one of the RSUs 110A, 110B, 110C or the vehicle device 120) receives the external object information broadcasted by other devices, the device can generate object integration information according to the local object information and the external object information, and broadcast the object integration information. In an exemplary embodiment, the object integration information may further comprise a field, wherein the field records which devices' object information has been integrated into the object integration information.
In an exemplary embodiment, the vehicle device may broadcast its traveling direction. After the RSU receives the traveling direction, the RSU may determine whether each local object is located in a free space along the traveling direction of the vehicle device, and may mark any local object not located within the free space as a non-critical object. For example, as shown in
It should be understood that the RSUs 110A, 110B, 110C and the vehicle device 120 shown in
In step S205, the electronic device generates local object information by sensing an environment within the first sensing range of the electronic device, wherein the local object information at least comprises first geographical distribution information of local objects within the first sensing range. In an exemplary embodiment, the local object information further comprises an identifier of the electronic device, first absolute position data of the electronic device and a local timestamp, and the first geographical distribution information comprises relative position data of the local objects relative to the electronic device.
Next, in step S210, the electronic device receives external object information transmitted by at least one node, wherein the external object information comprises at least second geographical distribution information of external objects within a second sensing range of the node. In an exemplary embodiment, the external object information further comprises an identifier of the node, second absolute position data of the node and an external timestamp, and the second geographical distribution information comprises relative position data of the external objects relative to the node.
In step S215, the electronic device generates object integration information according to the local object information and the external object information. In an exemplary embodiment, each of the electronic device and the node is an RSU or a vehicle device. In another exemplary embodiment, when the electronic device is an RSU, the electronic device further broadcasts the object integration information after step S215 is performed.
The following explains in detail how the electronic device generates the object integration information according to the local object information and the external object information in step S215.
In step S305, the electronic device determines whether the difference between the local timestamp and the external timestamp is greater than an update period, wherein the update period is the time interval at which the electronic device re-generates the local object information. When the difference is not greater than the update period (“No” in step S305), in step S310, the electronic device obtains the absolute position data of the local objects and the absolute position data of the external objects according to the first absolute position data, the second absolute position data, the first geographical distribution information and the second geographical distribution information. Specifically, the electronic device may unify the coordinate systems of the electronic device and the node by using Real-Time Kinematic (RTK) positioning based on the carrier-phase information of the GPS signal to obtain the absolute position data of the local objects and the absolute position data of the external objects.
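The sketch below illustrates, under stated assumptions, what such coordinate unification amounts to: once RTK yields the node's position and heading in the electronic device's frame, the node's relative object offsets can be rotated and translated into that common frame. The function name is hypothetical, and real RTK processing is far more involved than this.

```python
import math

def unify(node_position, node_heading_rad, relative_objects):
    """Map offsets sensed in the node's frame into the common frame."""
    c, s = math.cos(node_heading_rad), math.sin(node_heading_rad)
    x0, y0 = node_position
    return [(x0 + c * dx - s * dy, y0 + s * dx + c * dy)
            for dx, dy in relative_objects]

# A node at (50.0, 0.0) rotated 90 degrees from the common frame reports
# an object 2 m ahead (dx=2, dy=0); in the common frame it lies at (50, 2):
print(unify((50.0, 0.0), math.pi / 2, [(2.0, 0.0)]))
```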
Next, in step S315, the electronic device determines whether the absolute position data of the local objects are the same as the absolute position data of the external objects. In an exemplary embodiment, when the distance between the position center of a local object and the position center of an external object is less than a first predetermined value (e.g., 0.5 meters) and the difference between the height of the local object and the height of the external object is less than a second predetermined value (e.g., 0.1 meters), the electronic device determines that the absolute position data of the local object is the same as the absolute position data of the external object; in other words, the local object and the external object are the same object. In another exemplary embodiment, the electronic device may use a 3D algorithm to determine whether the absolute position data of the local object is the same as that of the external object. Exemplary 3D algorithms may use surface vertex features to determine whether the seams between the local object and the external object are smooth, compare the local object and the external object using the features of a distribution histogram, project the data of the local object and the external object onto a 2D plane and obtain a convex hull to determine whether the seams are reasonable, learn with neural networks, or use clustering to determine whether the local object and the external object belong to the same group, in order to determine whether they are the same object.
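A minimal sketch of the threshold test described above, using the example values from the text (0.5 m between position centers, 0.1 m in height); the function name and the (x, y, z) tuple form are assumptions.

```python
import math

def same_object(local_pos, external_pos, center_tol=0.5, height_tol=0.1):
    """local_pos/external_pos: (x, y, z) absolute object-center positions."""
    horizontal = math.hypot(local_pos[0] - external_pos[0],
                            local_pos[1] - external_pos[1])
    vertical = abs(local_pos[2] - external_pos[2])
    return horizontal < center_tol and vertical < height_tol

print(same_object((10.0, 5.0, 1.2), (10.3, 5.1, 1.25)))  # True: one object
print(same_object((10.0, 5.0, 1.2), (14.0, 5.0, 1.20)))  # False: two objects
```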
When the electronic device determines that the absolute position data of the local object is not the same as the absolute position data of the external object (“No” in step S315), in step S320, the electronic device integrates the local object information and the external object information to generate object integration information. Specifically, the electronic device stitches the local object information and the external object information together to generate the object integration information, which is the final information produced by combining the scenes sensed by the electronic device and the node into a final image covering a wider range.
Returning to step S305, when the difference is greater than the update period (“Yes” in step S305), in step S325, the electronic device deletes the external object information. In other words, the external object information may not conform to the current situation, and therefore the electronic device does not use the external object information.
Returning to step S315, when the electronic device determines that the absolute position data of the local object is the same as the absolute position data of the external object (“Yes” in step S315), in step S330, the electronic device does not integrate the local object information and the external object information. In other words, the external object information sensed by the node may be the same as the local object information sensed by the electronic device, and therefore the electronic device does nothing.
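Tying steps S305 through S330 together, the following hypothetical handler shows one way the decision flow could look; it assumes a planar position model and a plain distance threshold in place of the fuller same-object tests discussed above.

```python
def handle_external_info(local, external, update_period, tol=0.5):
    """local/external: dicts with 'timestamp', 'position' and 'objects'
    (relative offsets); returns the integrated absolute object list."""
    # S305/S325: stale external object information is deleted (ignored).
    if abs(local["timestamp"] - external["timestamp"]) > update_period:
        return None

    def absolute(info):  # S310: relative offsets -> absolute positions
        x0, y0 = info["position"]
        return [(x0 + dx, y0 + dy) for dx, dy in info["objects"]]

    local_abs, ext_abs = absolute(local), absolute(external)
    # S315: keep only external objects not already sensed locally.
    new = [e for e in ext_abs
           if all((e[0] - l[0]) ** 2 + (e[1] - l[1]) ** 2 > tol ** 2
                  for l in local_abs)]
    # S320 integrates the two views; with no new objects this reduces to
    # the local view alone, matching S330 (no integration needed).
    return local_abs + new
```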
As described above, through the method and the device for sensing the traffic environment provided in the disclosure, the vehicle device can obtain information about blind spots in multiple directions from the object integration information stitched together by the RSUs, thereby improving driving safety.
Having described exemplary embodiments of the present disclosure, an exemplary operating environment in which exemplary embodiments of the present disclosure may be implemented is described below. Referring to
The disclosure may be realized by means of computer code or machine-useable instructions, including computer-executable instructions such as program modules, being executed by a computer or other machine, such as a personal data assistant (PDA) or other handheld device. Generally, program modules may include routines, programs, objects, components, data structures, etc., and refer to code that performs particular tasks or implements particular abstract data types. The disclosure may be implemented in a variety of system configurations, including hand-held devices, consumer electronics, general-purpose computers, more specialized computing devices, etc. The disclosure may also be implemented in distributed computing environments where tasks are performed by remote-processing devices that are linked by a communication network.
With reference to
The electronic device 500 typically includes a variety of computer-readable media. The computer-readable media can be any available media that can be accessed by the electronic device 500 and include both volatile and nonvolatile media, and removable and non-removable media. By way of example, but not limitation, computer-readable media may comprise computer storage media and communication media. The computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. The computer storage media may include, but are not limited to, random access memory (RAM), read-only memory (ROM), electrically-erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the electronic device 500. The computer storage media do not comprise signals per se.
The communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, but not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media or any combination thereof.
The memory 512 may include computer-storage media in the form of volatile and/or nonvolatile memory. The memory may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid-state memory, hard drives, optical-disc drives, etc. The electronic device 500 includes one or more processors that read data from various entities such as the memory 512 or the I/O components 520. The presentation component(s) 516 present data indications to a user or other device. Exemplary presentation components include a display device, speaker, printing component, vibrating component, etc.
The I/O ports 518 allow the electronic device 500 to be logically coupled to other devices including the I/O components 520, some of which may be embedded. Illustrative components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, etc. The I/O components 520 may provide a natural user interface (NUI) that processes gestures, voice, or other physiological inputs generated by a user. For example, inputs may be transmitted to an appropriate network element for further processing. The electronic device 500 may be equipped with depth cameras, such as stereoscopic camera systems, infrared camera systems, RGB camera systems, or any combination thereof, to realize object detection and recognition. In addition, the electronic device 500 may be equipped with a sensor (e.g., radar or LIDAR) to periodically sense the neighboring environment within a sensing range and generate sensor information describing the relationship between the electronic device itself and the surrounding environment. Furthermore, the electronic device 500 may be equipped with accelerometers or gyroscopes that enable detection of motion. The output of the accelerometers or gyroscopes may be provided to the display of the electronic device 500 for display.
Furthermore, the processor 514 in the electronic device 500 can execute the program code in the memory 512 to perform the above-described actions and steps or other descriptions herein.
It should be understood that any specific order or hierarchy of steps in any disclosed process is an example of a sample approach. Based upon design preferences, it should be understood that the specific order or hierarchy of steps in the processes may be rearranged while remaining within the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not meant to be limited to the specific order or hierarchy presented.
Use of ordinal terms such as “first,” “second,” “third,” etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one claim element having a certain name from another element having the same name (but for use of the ordinal term) to distinguish the claim elements.
While the disclosure has been described by way of example and in terms of the exemplary embodiments, it should be understood that the disclosure is not limited to the disclosed exemplary embodiments. On the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Yang, Chung-Hsien, Tu, Ming-Ta, Tsai, Ping-Ta, Jeng, An-Kai