Provided is a route determination device including a recognition unit (121) that recognizes peripheral circumstances of a host vehicle; and an evaluation unit (123C) that evaluates each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle recognized by the recognition unit, and selects one or more routes from the plurality of routes based on an evaluation result. Each of the plurality of routes is generated by joining at least two of a plurality of edges. Each of the plurality of edges is generated by connecting two virtual nodes among a plurality of virtual nodes. The plurality of virtual nodes are located at intervals in each of a forward moving direction and a road width direction.
9. A route determination device comprising a processor configured to:
recognize peripheral circumstances of a host vehicle;
evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle;
select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predetermined distance from adjacent virtual nodes of the plurality of virtual nodes in each of a forward moving direction and a road width direction; and
determine that the host vehicle is to follow a preceding vehicle in a case that a sum of the costs of all of the plurality of routes exceeds a criterion.
13. A route determination device comprising a processor configured to:
recognize peripheral circumstances of a host vehicle;
evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle;
select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predetermined distance from adjacent virtual nodes of the plurality of virtual nodes in each of a forward moving direction and a road width direction;
recognize a direction in which an object on a periphery of the host vehicle moves; and
determine a cost to be applied to an edge on the periphery of the object, based on the direction.
11. A route determination device comprising a processor configured to:
recognize peripheral circumstances of a host vehicle;
evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle;
select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predetermined distance from adjacent virtual nodes of the plurality of virtual nodes in each of a forward moving direction and a road width direction;
recognize a position and a size of an object on a periphery of the host vehicle; and
increase a cost to be applied to an edge on the periphery of the object in proportion to an increase in a width of the recognized object.
15. A route determination method using a computer mounted in a host vehicle, comprising:
recognizing peripheral circumstances of the host vehicle;
evaluating each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle; and
selecting one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predefined distance from one another in each of a forward moving direction and a road width direction,
wherein the recognizing includes recognizing a position and a state of an object on a periphery of the host vehicle, and
the evaluating includes applying a cost to each of the plurality of edges based on a future position of the object estimated from the position and the state of the object, for evaluating each of the plurality of routes.
16. A non-transitory computer-readable storage medium which stores a program for causing a computer mounted in a host vehicle:
to recognize peripheral circumstances of the host vehicle;
to evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle; and
to select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, the plurality of virtual nodes being located a predetermined distance from one another in each of a forward moving direction and a road width direction,
wherein the recognition includes recognizing a position and a state of an object on a periphery of the host vehicle, and
the evaluation includes applying a cost to each of the plurality of edges based on a future position of the object estimated from the position and the state of the object, for evaluating each of the plurality of routes.
5. A route determination device comprising a processor configured to:
recognize peripheral circumstances of a host vehicle;
evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle;
select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predetermined distance from adjacent virtual nodes of the plurality of virtual nodes in each of a forward moving direction and a road width direction;
recognize a position and a state of an object on a periphery of the host vehicle;
estimate a future position of the object based on the position and the state of the object; and
apply a cost to each of the plurality of edges based on the future position of the object for evaluating each of the plurality of routes.
1. A route determination device comprising a processor configured to:
recognize peripheral circumstances of a host vehicle;
evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle;
select one or more routes from the plurality of routes based on an evaluation result, each of the plurality of routes being generated by joining at least two edges of the plurality of edges, and each of the plurality of edges being generated by connecting two adjacent virtual nodes among a plurality of virtual nodes, virtual nodes of the plurality of virtual nodes being located a predetermined distance from adjacent virtual nodes of the plurality of virtual nodes in each of a forward moving direction and a road width direction;
recognize a position of an object on a periphery of the host vehicle; and
increase a cost to be applied to an edge corresponding to another lane on a side toward which the object is biased, in a case that the recognized position of the object is biased from a center of a lane, for evaluating each of the plurality of routes.
2. The route determination device according to
wherein the plurality of virtual nodes are located for each lane in the road width direction.
3. The route determination device according to
wherein the processor individually evaluates one or more routes in which the host vehicle is able to arrive at a target lane at a trailing end of an evaluation section in a case that the target lane at the trailing end of the evaluation section is set.
4. A vehicle control device comprising:
the route determination device according to
a driving control unit that controls the host vehicle to move based on a route selected by the processor of the route determination device.
6. The route determination device according to
wherein the plurality of virtual nodes are located for each lane in the road width direction.
7. The route determination device according to
wherein the processor individually evaluates one or more routes in which the host vehicle is able to arrive at a target lane at a trailing end of an evaluation section in a case that the target lane at the trailing end of the evaluation section is set.
8. A vehicle control device comprising:
the route determination device according to
a driving control unit that controls the host vehicle to move based on a route selected by the processor of the route determination device.
10. A vehicle control device comprising:
the route determination device according to
a driving control unit that controls the host vehicle to move based on a route selected by the processor of the route determination device.
12. A vehicle control device comprising:
the route determination device according to
a driving control unit that controls the host vehicle to move based on a route selected by the processor of the route determination device.
14. A vehicle control device comprising:
the route determination device according to
a driving control unit that controls the host vehicle to move based on a route selected by the processor of the route determination device.
The present invention relates to a route determination device, a vehicle control device, a route determination method, and a storage medium.
In the related art, Japanese Patent No. 5614055 discloses a driving support device which estimates a future moving state of a moving body on the periphery of a host vehicle and performs driving support based on the future moving state. The device includes moving state acquiring means for acquiring a current moving state of a moving body, and moving state estimating means for estimating a presence probability distribution of the moving body after a predetermined time based on a current moving state of the moving body acquired by the moving state acquiring means. The device individually calculates presence probability of an obstacle and presence probability of the host vehicle with respect to areas demarcated on a grid, and ultimately performs control including automated steering.
According to the technology in the related art, there may be cases in which the computation load is high and a real-time response cannot be performed in a traveling situation of a vehicle. Since the movement of a peripheral object in a traveling situation of a vehicle changes from moment to moment, it is sometimes desirable to first perform a coarse calculation on a broad scale.
The present invention has been made in consideration of such circumstances, and an object thereof is to provide a route determination device which is able to determine a route more quickly and appropriately, a vehicle control device, a route determination method, and a storage medium.
A vehicle control system, a vehicle control method, and a storage medium according to the invention employ the following configurations.
(1): There is provided a route determination device including a recognition unit that recognizes peripheral circumstances of a host vehicle, and an evaluation unit that evaluates each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the peripheral circumstances of the host vehicle recognized by the recognition unit, and selects one or more routes from the plurality of routes based on an evaluation result. Each of the plurality of routes is generated by joining at least two of a plurality of edges. Each of the plurality of edges is generated by connecting two virtual nodes among a plurality of virtual nodes. The plurality of virtual nodes are located at intervals in each of a forward moving direction and a road width direction.
(2): In (1), the recognition unit recognizes a position and a state of an object on the periphery of the host vehicle. The route determination device further includes an estimation unit that estimates a future position of the object based on a position and a state of the object recognized by the recognition unit. The evaluation unit applies a cost to each of the plurality of edges based on a future position of the object estimated by the estimation unit.
(3): In (1), the plurality of virtual nodes are located for each lane in the road width direction.
(4): In (1), the evaluation unit individually evaluates one or more routes in which the host vehicle is able to arrive at a target lane at a trailing end of an evaluation section in a case that the target lane at the trailing end of the evaluation section is set.
(5): In (1), the evaluation unit determines that the host vehicle is to travel while following a preceding vehicle in a case that a sum of the costs of all of the plurality of routes exceeds a criterion.
(6): In (1), the recognition unit recognizes a position of an object on the periphery of the host vehicle. The evaluation unit increases a cost to be applied to an edge corresponding to another lane on a side toward which the object is biased, in a case that the position of the object recognized by the recognition unit is biased from a center of a lane.
(7): In (1), the recognition unit recognizes a position and a size of an object on the periphery of the host vehicle. The evaluation unit increases a cost to be applied to an edge on the periphery of the object in a case that the object recognized by the recognition unit is large in size.
(8): In (1), the recognition unit recognizes a direction in which an object on the periphery of the host vehicle moves. The evaluation unit determines a cost to be applied to an edge on the periphery of the object, based on the direction recognized by the recognition unit.
(9): There is provided a vehicle control device including the route determination device of (1), and a driving control unit that controls the host vehicle to move based on a route selected by the evaluation unit of the route determination device.
(10): There is provided a route determination method using a computer mounted in a host vehicle, comprising: recognizing peripheral circumstances of the host vehicle, evaluating each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the recognized peripheral circumstances of the host vehicle, and selecting one or more routes from the plurality of routes based on an evaluation result. Each of the plurality of routes is generated by joining at least two of a plurality of edges. Each of the plurality of edges is generated by connecting two virtual nodes among a plurality of virtual nodes. The plurality of virtual nodes are located at intervals in each of a forward moving direction and a road width direction.
(11): There is provided a non-transitory computer-readable storage medium which stores a program for causing a computer mounted in a host vehicle to recognize peripheral circumstances of the host vehicle, to evaluate each of a plurality of routes based on a sum of costs respectively applied to a plurality of edges based on the recognized peripheral circumstances of the host vehicle, and to select one or more routes from the plurality of routes based on an evaluation result. Each of the plurality of routes is generated by joining at least two of a plurality of edges. Each of the plurality of edges is generated by connecting two virtual nodes among a plurality of virtual nodes. The plurality of virtual nodes are located at intervals in each of a forward moving direction and a road width direction.
According to the aspects of the invention, it is possible to determine a route more quickly and appropriately.
Hereinafter, an embodiment of a route determination device, a vehicle control device, a route determination method, and a storage medium according to the present invention will be described with reference to the drawings.
[Overall Configuration]
A configuration of the present embodiment in which a route determination device is applied to an automated drive vehicle will be described.
For example, the vehicle system 1 includes a camera 10, a radar device 12, a finder 14, an object recognition device 16, a communication device 20, a human machine interface (HMI) 30, a navigation device 50, a micro-processing unit (MPU) 60, a vehicle sensor 70, a drive operation piece 80, an automated drive control unit 100, a traveling driving force output device 200, a brake device 210, and a steering device 220. The devices and instruments are connected to one another through a multiplex communication line such as a controller area network (CAN) communication line, a serial communication line, a radio communication network, or the like. The configuration illustrated in
For example, the camera 10 is a digital camera utilizing a solid-state image sensing device such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS). One or a plurality of cameras 10 are attached in arbitrary locations in a vehicle (which will hereinafter be referred to as a host vehicle M) in which the vehicle system 1 is mounted. In a case of capturing an image of an area ahead, the camera 10 is attached to an upper portion of a front windshield, the rear surface of a rearview mirror, or the like. For example, the camera 10 captures an image of the surroundings of the host vehicle M periodically and repetitively. The camera 10 may be a stereo camera.
The radar device 12 detects at least the position (distance and azimuth) of an object by emitting radio waves such as millimeter waves on the periphery of the host vehicle M and detecting the radio waves (reflected waves) reflected by the object. One or a plurality of radar devices 12 are attached in arbitrary locations in the host vehicle M. The radar device 12 may detect the position and the speed of an object by a frequency modulated continuous wave (FM-CW) method.
The finder 14 is a light detection and ranging (LIDAR) device which measures scattered light with respect to irradiation light and detects the distance to a target. One or a plurality of finders 14 are attached in arbitrary locations in the host vehicle M.
The object recognition device 16 performs sensor fusion processing with respect to a part or all of the detection results of the camera 10, the radar device 12, and the finder 14, thereby recognizing the position, the type, the speed, and the like of an object. The object recognition device 16 outputs a recognition result to the automated drive control unit 100.
The communication device 20 communicates with a different vehicle present on the periphery of the host vehicle M utilizing, for example, a cellular network, a Wi-Fi network, Bluetooth (registered trademark), or dedicated short range communication (DSRC), or communicates with various server devices via a radio base station.
The HMI 30 presents various types of information to an occupant of the host vehicle M and receives an input operation of the occupant. The HMI 30 includes various display devices, a speaker, a buzzer, a touch panel, a switch, keys and the like.
For example, the navigation device 50 includes a global navigation satellite system (GNSS) receiver 51, a navigation HMI 52, and a route determination unit 53 and retains first map information 54 in a storage device such as a hard disk drive (HDD) or a flash memory. The GNSS receiver 51 specifies the position of the host vehicle M based on a signal received from a GNSS satellite. The position of the host vehicle M may be specified or complemented by an inertial navigation system (INS) utilizing an output of the vehicle sensor 70. The navigation HMI 52 includes a display device, a speaker, a touch panel, keys, and the like. A part or all of the navigation HMI 52 may be shared with the HMI 30 described above. For example, with reference to the first map information 54, the route determination unit 53 determines a route (hereinafter, a route on a map) to a destination input by an occupant using the navigation HMI 52, from the position of the host vehicle M (or an input arbitrary position) specified by the GNSS receiver 51. For example, the first map information 54 is information in which a road shape is expressed with links indicating a road, and nodes connected by the links. The first map information 54 may include curvatures of a road, point of interest (POI) information, and the like. A route on a map determined by the route determination unit 53 is output to the MPU 60. The navigation device 50 may perform route guidance using the navigation HMI 52 based on a route on a map determined by the route determination unit 53. For example, the navigation device 50 may be realized by a function of a terminal device such as a smart phone or a tablet terminal possessed by a user. The navigation device 50 may transmit a current position and a destination to a navigation server via the communication device 20 and may acquire a route on a map sent back from the navigation server.
For example, the MPU 60 functions as a recommendation lane determination unit 61 and retains second map information 62 in the storage device such as an HDD or a flash memory. The recommendation lane determination unit 61 divides a route provided from the navigation device 50 into a plurality of blocks (for example, divides a route every 100 m related to a forward moving direction of a vehicle) and determines a recommendation lane for each block with reference to the second map information 62. The recommendation lane determination unit 61 makes a determination, such as in which lane from the left to travel. When a bifurcated location, a merging location, or the like is present in a route, the recommendation lane determination unit 61 determines a recommendation lane such that the host vehicle M can travel along a reasonable route to move ahead of the bifurcation.
The second map information 62 is highly precise map information compared to the first map information 54. For example, the second map information 62 includes information of the center of a lane, information of the demarcation of a lane, and the like. The second map information 62 may include road information, traffic regulation information, address information (address and zip code), facility information, telephone number information, and the like. The road information includes information indicating the type of a road such as an expressway, a toll road, a national highway, and a local road; and information such as the number of lanes of the road, the width of each lane, the gradient of the road, the position (three-dimensional coordinates including the longitude, the latitude, and the height) of the road, the curvature of lane curves, the positions of merging and the bifurcation points of the lanes, signs provided on roads, and the like. The second map information 62 may be updated any time through access to a different device using the communication device 20.
The vehicle sensor 70 includes a vehicle speed sensor detecting the speed of the host vehicle M, an acceleration sensor detecting acceleration, a yaw rate sensor detecting an angular speed around a vertical axis, an azimuth sensor detecting the direction of the host vehicle M, and the like.
For example, the drive operation piece 80 includes an accelerator pedal, a brake pedal, a shift lever, a steering wheel, and other operation pieces. A sensor detecting an operation amount or the presence or absence of an operation is attached to the drive operation piece 80, and the detection result thereof is output to one or both of the automated drive control unit 100; and the traveling driving force output device 200, the brake device 210, and the steering device 220.
For example, the automated drive control unit 100 includes a first control unit 120 and a second control unit 140. Each of the first control unit 120 and the second control unit 140 is realized when a processor such as a central processing unit (CPU) executes a program (software). The program may be stored in a storage device such as a hard disk drive (HDD) or a flash memory in advance. The program may be stored in an attachable/detachable storage medium such as a DVD or a CD-ROM and may be installed in the storage device when a drive device is equipped with the storage medium. A part or all of the functional portions of the first control unit 120 and the second control unit 140 may be realized by hardware (circuit section; including circuitry) such as a large scale integration (LSI), an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA), or may be realized by software and hardware in cooperation. The automated drive control unit 100 is an example of the "vehicle control device".
For example, the first control unit 120 includes an exterior recognition unit 121, a host vehicle position recognition unit 122, and an action plan generation unit 123. The exterior recognition unit 121 includes an object type discrimination unit 121A. The action plan generation unit 123 includes an edge postulation unit 123A, a future position estimation unit 123B, and an evaluation unit 123C. A combination of the object type discrimination unit 121A, the edge postulation unit 123A, the future position estimation unit 123B, and the evaluation unit 123C is an example of the route determination device.
The exterior recognition unit 121 recognizes the state (the position, the speed, the acceleration, and the like) of an object on the periphery of the host vehicle M based on information input via the object recognition device 16 from the camera 10, the radar device 12, and the finder 14. The position of an object may be indicated with a representative point, such as the center of gravity or a corner of the object, or may be indicated by a region representing the object. The "state" of an object may include the acceleration or jerk of the object, or an "action state" (for example, whether or not the object is making or intends to make a lane change).
The object type discrimination unit 121A of the exterior recognition unit 121 discriminates the type of an object (a heavy vehicle, a general vehicle, a truck, a two-wheeled vehicle, a pedestrian, a guardrail, a utility pole, a parked vehicle, or the like). For example, the object type discrimination unit 121A discriminates the type of an object based on the size or the shape of the object in an image captured by the camera 10, received intensity of the radar device 12, and other information.
For example, the host vehicle position recognition unit 122 recognizes the lane (traveling lane) in which the host vehicle M is traveling, and the relative position and posture of the host vehicle M with respect to the traveling lane. For example, the host vehicle position recognition unit 122 recognizes a traveling lane by comparing the pattern (for example, a layout of solid lines and dotted lines) of road demarcation lines obtained from the second map information 62, and the pattern of road demarcation lines on the periphery of the host vehicle M recognized from an image captured by the camera 10. The position of the host vehicle M acquired from the navigation device 50 or a processing result of the INS may be added to this recognition.
For example, the host vehicle position recognition unit 122 recognizes the position or the posture of the host vehicle M with respect to the traveling lane.
The action plan generation unit 123 determines events which are sequentially executed in automated driving, such that the host vehicle M travels in a recommendation lane determined by the recommendation lane determination unit 61 and can cope with peripheral circumstances of the host vehicle M. Examples of the events include a constant-speed traveling event of traveling in the same traveling lane at a constant speed, a following traveling event of following a preceding vehicle, a passing event of passing a preceding vehicle, an avoiding event of avoiding an obstacle, a lane-change event, a merging event, a bifurcating event, an emergency stopping event, and a handover event of ending automated driving and switching over to manual driving. Sometimes action for avoidance is planned based on peripheral circumstances of the host vehicle M (the presence of a peripheral vehicle or a pedestrian, a lane narrowed due to road work, and the like) while the events are executed.
The action plan generation unit 123 generates a target course in which the host vehicle M will travel in the future, by means of functions of the edge postulation unit 123A, the future position estimation unit 123B, and the evaluation unit 123C. Each of the functional portions will be described below in detail. For example, a target course includes a speed factor. For example, a target course is expressed as a course generated by sequentially arranging spots (course points) at which the host vehicle M ought to arrive. A course point is a spot which is provided every predetermined traveling distance and at which the host vehicle M ought to arrive. Apart from that, a target speed and target acceleration for each predetermined sampling time (for example, approximately several tenths of a second) are generated as a part of a target course. A course point may be a position at which the host vehicle M ought to arrive at the corresponding sampling time for each predetermined sampling time. In this case, information of the target speed or the target acceleration is expressed at the intervals of the course points.
The second control unit 140 includes a traveling control unit 141. The traveling control unit 141 controls the traveling driving force output device 200, the brake device 210, and the steering device 220 such that the host vehicle M passes through a target course generated by the action plan generation unit 123, at a scheduled time.
The traveling driving force output device 200 outputs a traveling driving force (torque) to driving wheels such that a vehicle travels. For example, the traveling driving force output device 200 includes a combination of an internal combustion engine, an electric motor, and a gearbox, and an ECU which controls these. The ECU controls the configuration described above in accordance with information input from the traveling control unit 141 or information input from the drive operation piece 80.
For example, the brake device 210 includes a brake caliper, a cylinder which transfers a hydraulic pressure to the brake caliper, an electric motor which generates a hydraulic pressure in the cylinder, and a brake ECU. The brake ECU controls the electric motor in accordance with information input from the traveling control unit 141 or information input from the drive operation piece 80 such that a brake torque in response to a braking operation is output to each of wheels. The brake device 210 may include, as a back-up, a mechanism for transferring a hydraulic pressure, which is generated by operating the brake pedal included in the drive operation piece 80, to the cylinder via a master cylinder. The brake device 210 is not limited to the configuration described above and may be an electronic control hydraulic brake device which controls an actuator in accordance with information input from the traveling control unit 141 and transfers a hydraulic pressure of the master cylinder to the cylinder.
For example, the steering device 220 includes a steering ECU and an electric motor. For example, the electric motor causes a force to act on a rack-and-pinion mechanism and changes the direction of turning wheels. The steering ECU drives the electric motor in accordance with information input from the traveling control unit 141 or information input from the drive operation piece 80 and changes the direction of the turning wheels.
[Course Determination Based on Route Analysis]
Hereinafter, an example of a technique by which the action plan generation unit 123 generates a target course will be described. For example, the technique described below is executed when any of the various events described above has started.
(Node and Edge)
For example, the edge postulation unit 123A postulates a plurality of nodes ND at each predetermined distance along the forward moving direction S of the host vehicle M. Alternatively, the edge postulation unit 123A postulates a plurality of nodes ND at each distance which the host vehicle M covers in each predetermined time along the forward moving direction S. For example, in the road width direction W, the edge postulation unit 123A postulates nodes ND for each lane such that the nodes ND are positioned at the center of the lane. In the description below, sometimes a node ND marked with circled number 1 in
An edge ED may be set only between two adjacent nodes ND in the forward moving direction S or the road width direction W, or may be set between any two nodes ND selected from all of the nodes ND. In the latter case, an edge between two nodes ND which are not adjacent to each other is excluded from the plurality of edges configuring a route by setting its cost (described below) sufficiently high. In the description below, the latter technique will be employed.
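The node and edge layout described above can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the function names, the grid indexing, and the choice of `HUGE_COST` are assumptions, and adjacency is taken here to mean neighboring nodes in the forward moving direction or the road width direction (diagonal lane-change edges could be added similarly).

```python
from itertools import product

HUGE_COST = 1e9  # assumption: a cost large enough to exclude an edge from any viable route


def postulate_nodes(n_steps, n_lanes):
    """Lay out virtual nodes on a grid: n_steps rows in the forward moving
    direction S, n_lanes columns in the road width direction W.
    Returns node ids keyed by (forward index, lane index)."""
    return {(s, w): s * n_lanes + w + 1
            for s, w in product(range(n_steps), range(n_lanes))}


def postulate_edges(nodes, base_cost=1.0):
    """Create an edge between every pair of nodes ("latter technique" above);
    non-adjacent pairs receive HUGE_COST so they are excluded in practice."""
    edges = {}
    for (s1, w1), a in nodes.items():
        for (s2, w2), b in nodes.items():
            if a >= b:
                continue
            adjacent = abs(s1 - s2) + abs(w1 - w2) == 1
            edges[(a, b)] = base_cost if adjacent else HUGE_COST
    return edges
```

With 2 forward steps and 3 lanes this yields 6 nodes; the edge between laterally adjacent nodes (1, 2) keeps the base cost, while the edge between distant nodes (1, 6) is effectively excluded.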
The future position estimation unit 123B estimates a future position of an object (particularly, a peripheral vehicle) recognized by the exterior recognition unit 121. For example, the future position estimation unit 123B estimates a future position of an object, assuming that the object performs a uniform speed motion. Alternatively, the future position estimation unit 123B may estimate a future position of an object, assuming that the object performs a uniform acceleration motion or a uniform jerk motion. For example, the future position estimation unit 123B estimates a future position of an object at future timings, for each predetermined time or for each movement of the host vehicle M corresponding to one node ND in the forward moving direction S. The time required for the host vehicle M to move by one grid in the forward moving direction S in a target course candidate depends on a future speed change of the host vehicle M. However, for example, the future position estimation unit 123B sets future timings assuming that the host vehicle M makes any motion that can be estimated, such as a uniform speed motion, a uniform acceleration motion, a uniform jerk motion, or a motion based on an estimated motion model calculated from a probabilistic statistical model.
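A minimal sketch of the uniform-speed (optionally uniform-acceleration) prediction described above, under the assumption that positions are sampled at fixed intervals; the function name and signature are illustrative only.

```python
def predict_positions(x0, y0, vx, vy, dt, n_steps, ax=0.0, ay=0.0):
    """Predict an object's future (x, y) positions at each sampling time,
    assuming uniform speed, or uniform acceleration if ax/ay are given."""
    positions = []
    for k in range(1, n_steps + 1):
        t = k * dt
        positions.append((x0 + vx * t + 0.5 * ax * t * t,
                          y0 + vy * t + 0.5 * ay * t * t))
    return positions
```

For example, an object at the origin moving forward at 10 m/s, sampled every 0.5 s, is predicted at 5 m and then 10 m ahead.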
The evaluation unit 123C evaluates a plurality of routes. Each of the plurality of routes is generated by joining a plurality of edges ED. For example, the evaluation unit 123C evaluates all of the routes which can be generated with combinations of the plurality of edges ED. For example, in the example of
When no target lane is designated at the trailing end of the evaluation section A1 in advance, all of a node (16), a node (17), and a node (18) can be final nodes ND (which will hereinafter be referred to as trailing end nodes). In this case, the evaluation unit 123C postulates a combination of all of the edges ED from the node (2) to the node (16), a combination of all of the edges ED from the node (2) to the node (17), and a combination of all of the edges ED from the node (2) to the node (18), and performs an evaluation based on the sum of the costs of the edges ED for all of them. Examples of combinations of all of the edges ED from the node (2) to the node (16) include the edges (2, 1), (1, 4), (4, 7), (7, 10), (10, 13), and (13, 16), the edges (2, 5), (5, 8), (8, 7), (7, 10), (10, 13), and (13, 16), and the like. Here, "a case in which no target lane is designated in advance" may include a case in which although the "recommendation lane" described above is determined, the host vehicle M does not necessarily need to travel in the recommendation lane during controlling. Hereinafter, this state will be sometimes referred to as "in free traveling".
Meanwhile, when a target lane is designated at the trailing end of the evaluation section A1 in advance, the evaluation unit 123C postulates a combination of all of the edges ED from the node (2) to the trailing end node corresponding to the target lane, and performs an evaluation based on the sum of the costs of the edges ED for all of them. For example, "a case in which a target lane is designated in advance" is a case in which the host vehicle M needs to exit a main line to a bifurcated road in order to head for a destination, and travels while setting a lane on the bifurcated road side as a target lane. This may be regarded as a case in which a bifurcating event or a lane-change event has started.
Here, when a route is generated by joining edges ED, rules such as "consecutively selected edges ED are connected to each other by a node ND" and "there is no going back (for example, the host vehicle M does not move from a node (5) to the node (2), and does not move to the node (5) again after moving from the node (5) to the node (4))" are applied.
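The joining rules above can be sketched as a depth-first enumeration of routes over the edge-cost mapping. This is an illustrative assumption about how the enumeration might be realized, not the claimed implementation; the function name and data layout are hypothetical.

```python
def enumerate_routes(edges, start, goals):
    """Enumerate routes from `start` to any trailing end node in `goals`.
    Consecutive edges share a node, and no node is visited twice
    ("no going back"). `edges` maps (node_a, node_b) -> cost."""
    neighbors = {}
    for (a, b), cost in edges.items():
        neighbors.setdefault(a, []).append((b, cost))
        neighbors.setdefault(b, []).append((a, cost))
    routes = []

    def dfs(node, visited, path_cost):
        if node in goals:
            routes.append((tuple(visited), path_cost))
            return  # stop at a trailing end node
        for nxt, cost in neighbors.get(node, []):
            if nxt not in visited:  # "no going back" rule
                visited.append(nxt)
                dfs(nxt, visited, path_cost + cost)
                visited.pop()

    dfs(start, [start], 0.0)
    return routes
```

On a small diamond graph (1→2→4 and 1→3→4) this returns both routes with their total costs, so the evaluation unit's cost comparison can be applied to all of them.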
The evaluation unit 123C applies a cost to each edge ED based on, among other things, the future position of an object estimated by the future position estimation unit 123B from the position of the object recognized by the exterior recognition unit 121. First, the evaluation unit 123C sets NG nodes in order to calculate the cost.
(NG Node)
At a time t1, the representative point (for example, the central portion of the front end portion in the vehicle width direction) of the host vehicle M has not entered the NG zone. Therefore, the node (5) closest to the position of the host vehicle M at the time t1 does not become an NG node.
At a time t2, the representative point of the host vehicle M has not entered the NG zone. Therefore, the node (8) closest to the position of the host vehicle M at the time t2 does not become an NG node.
At a time t3, the representative point of the host vehicle M has entered the NG zone. Therefore, the node (11) closest to the position of the host vehicle M at the time t3 becomes an NG node.
At a time t4, the representative point of the host vehicle M has entered the NG zone. Therefore, the node (14) closest to the position of the host vehicle M at the time t4 becomes an NG node.
At a time t5, the representative point of the host vehicle M has entered the NG zone. Therefore, the node (17) closest to the position of the host vehicle M at the time t5 becomes an NG node.
(Edge Cost Table)
The evaluation unit 123C applies a cost to each edge ED by the technique described below. In order to describe the cost applied to the edge ED, the concept of an edge cost table is introduced. The edge cost table may be held as an array in a RAM or the like, or may be retained as data in a different form (for example, in the form of a list). In the description below, for convenience, processing of updating the edge cost table in stages is described. However, the evaluation unit 123C need only perform equivalent processing and may do so in any procedure.
Summarizing the above, the magnitude relationship of the costs set for the edges ED is indicated in the following order, from large to small. The cost is not limited to the following four types and may be adjusted based on a bias or the like of the peripheral vehicle m within its lane, as described below.
<<Large>>
(A) The edge ED connecting two nodes ND which are not adjacent to each other
(B) The edge ED connected to the NG node
(C) The edge ED connecting two nodes ND which are adjacent to each other and not corresponding to (B) or (D).
(D) The edge ED connected in the road width direction W with respect to the nodes ND adjacent to the NG nodes related to the forward moving direction S.
<<Small>>
When costs have been set for the edges ED, the evaluation unit 123C selects a route in accordance with either of the following. In free traveling, the evaluation unit 123C calculates (evaluates) the sum of the costs of the edges ED configuring a route, for all of the routes from the node (2) to all of the trailing end nodes ND ((16), (17), and (18) in the example of
Meanwhile, in a situation in which a target lane is designated in advance, the evaluation unit 123C calculates (evaluates) the sum of the costs of the edges ED configuring a route, for all of the routes from the node (2) to the node ND (goal node) corresponding to the target lane at the trailing end of the evaluation section A1. The route having the smallest total cost is determined as the route in which the host vehicle M ought to travel. As described above, when all of the calculated total costs are equal to or greater than the threshold value, the evaluation unit 123C determines that it is preferable to perform following traveling without passing or making a lane change.
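The selection step above — take the minimum-total-cost route, but fall back to following traveling when every route is at or above the threshold — can be sketched as follows; the function name and return convention are illustrative assumptions.

```python
def decide(routes, threshold):
    """Pick the route with the least total cost from `routes`, a list of
    (node_tuple, total_cost) pairs. If even the best route's total cost is
    at or above `threshold`, fall back to following the preceding vehicle."""
    best_route, best_cost = min(routes, key=lambda rc: rc[1])
    if best_cost >= threshold:
        return ("follow", None)
    return ("route", best_route)
```

For example, with candidate routes costing 5.0 and 2.0 against a threshold of 10.0, the cheaper route is chosen; with a single route costing 5.0 against a threshold of 4.0, following traveling is chosen instead.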
When the route is determined by the evaluation unit 123C, the action plan generation unit 123 generates a smooth curve which passes near each of the nodes ND on the determined route and is expressed by a spline function or the like. Then, the action plan generation unit 123 sets the above-described course points on the curve. In this way, a course is determined based on route analysis.
(Calculation of NG Zone)
Here, a technique of setting an NG zone will be described.
For example, the evaluation unit 123C calculates the length NGsf of the NG zone ahead of the peripheral vehicle m based on Expression (1), and calculates the length NGsr of the NG zone behind the peripheral vehicle m based on Expression (2). Here, the speed of the host vehicle M in the forward moving direction is vxego, and the speed of the peripheral vehicle m in the forward moving direction is vxother.
NGsf=vxother×tLC+margin (1)
NGsr=vxego×tLC+margin (2)
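Expressions (1) and (2) can be written directly as code; the parameter names follow the text (tLC a lane-change time, margin a safety margin), and the units are assumed to be consistent (e.g. m/s and s).

```python
def ng_zone_lengths(vx_ego, vx_other, t_lc, margin):
    """Expressions (1) and (2): the forward length of the NG zone grows with
    the peripheral vehicle's speed, and the rearward length grows with the
    host vehicle's speed."""
    ng_sf = vx_other * t_lc + margin  # (1) ahead of the peripheral vehicle m
    ng_sr = vx_ego * t_lc + margin    # (2) behind the peripheral vehicle m
    return ng_sf, ng_sr
```

For instance, with vxego = 20 m/s, vxother = 15 m/s, tLC = 3 s, and a 5 m margin, the zone extends 50 m ahead of and 65 m behind the peripheral vehicle.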
((Case in which Host Vehicle and Peripheral Vehicle are in the Same Lane))
(a) First, a case in which the speed of the peripheral vehicle m in the road width direction is zero (or smaller than a criterion) will be described. In this case, for example, the evaluation unit 123C calculates the width NGwl of the NG zone on the left side, seen from the front-rear directional axis Cm of the peripheral vehicle m, based on Expression (3), and calculates the width NGwr of the NG zone on the right side based on Expression (4). For example, the constant value const is equal to half the lane width CWH. The function c(VW) takes the vehicle width VW of the peripheral vehicle m as a parameter and tends to increase as the vehicle width VW increases.
NGwl=c(VW)·const (3)
NGwr=c(VW)·const (4)
Since the function c(VW) tends to increase with the vehicle width VW of the peripheral vehicle m, the evaluation unit 123C increases the cost applied to edges on the periphery of the peripheral vehicle m when the peripheral vehicle m is large in size.
(b) Next, a case in which the speed of the peripheral vehicle m in the road width direction is not zero (or greater than the criterion) will be described. The case described in the next paragraph is excluded here.
NGwl=c(VW)·max(|vyother|×tLC, const) (5)
NGwr=c(VW)·const (6)
(c) Next, a case in which the speed of the host vehicle M in the road width direction is not zero (or greater than the criterion), the speed of the peripheral vehicle m in the road width direction is not zero (or greater than the criterion), and the speeds are caused in the same direction will be described. Excluding the case described above, other cases may be classified into any of (a) and (b).
NGwl=c(VW)·max(|vyother|×tLC, const) (7)
NGwr=c(VW)·max(|vyego|×tLC, const) (8)
Expressions (3) to (8) are established on the premise that the host vehicle M and the peripheral vehicle m are in the same lane, that is, that the position yego of the host vehicle M in the road width direction ≈ the position yother of the peripheral vehicle m in the road width direction. In addition, when the host vehicle M and the peripheral vehicle m are in different lanes, that is, when yego≠yother, the evaluation unit 123C may perform a calculation different from those based on Expressions (3) to (8).
((Case in which Host Vehicle and Peripheral Vehicle are in Different Lanes))
In this case, (A) when the speeds of both the host vehicle M and the peripheral vehicle m in the road width direction are zero (or smaller than the criterion), and (B) when one or both of the host vehicle M and the peripheral vehicle m have speeds in the road width direction but the speeds are in directions moving away from each other, the evaluation unit 123C may set NGwl=NGwr=c(VW)·const without particularly considering the speed in the road width direction.
(C) Meanwhile, when the host vehicle M and the peripheral vehicle m are in different lanes, and one or both of them have speeds in the road width direction in a direction in which the host vehicle M and the peripheral vehicle m approach each other, the evaluation unit 123C sets the widths NGwl and NGwr of the NG zone as follows.
NGwl=c(VW)·max(|vyother|×tLC, const) (9)
NGwr=c(VW)·const (10)
NGwl=c(VW)·const (11)
NGwr=c(VW)·max(|vyother|×tLC, const) (12)
NGwl=c(VW)·max(|vyego|×tLC, |vyother|×tLC, const) (13)
NGwr=c(VW)·const (14)
From Expressions (1) to (14), it is understood that the relationship between the length NGsf of the NG zone and the widths NGwl and NGwr of the NG zone depends on the inclination (vyother/vxother) of the direction of the peripheral vehicle m with respect to the road. Therefore, the evaluation unit 123C sets the NG zone based on the inclination of the direction of the peripheral vehicle m with respect to the road, thereby determining the cost.
The NG zone is also applied to the peripheral vehicle m present behind the host vehicle M. That is, when the peripheral vehicle m approaches from behind at a speed higher than that of the host vehicle M, the host vehicle M is controlled to make a lane change or to accelerate such that the host vehicle M is not caught up in the NG zone on a side ahead of the peripheral vehicle m.
According to this processing, it is possible to set an NG zone based on the motion of the peripheral vehicle m along the road width direction W, to set a high cost for a location into which the peripheral vehicle m may make a lane change, and to determine a route that avoids traveling there.
(Revision of Cost and Position of Node in Accordance with Positional Bias of Peripheral Vehicle)
β=γ×ew/CWH (15)
According to this processing, it is possible to set a high cost for a part cramped by the peripheral vehicle m leaning toward it, and to determine a route that avoids traveling there.
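Expression (15) can be written as a small helper. The interpretation here is an assumption: ew is taken to be the peripheral vehicle m's lateral offset within its lane, CWH the lane width, and γ a tuning gain, so that β is a dimensionless ratio used to revise node positions or costs.

```python
def lateral_bias_ratio(e_w, cwh, gamma):
    """Expression (15): beta = gamma * e_w / CWH.
    Assumptions: e_w is the peripheral vehicle's lateral offset within its
    lane, CWH is the lane width, and gamma is a tuning gain."""
    return gamma * e_w / cwh
```

For example, a 0.5 m offset in a 3.5 m lane with γ = 1.0 yields β = 1/7.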
toverlap−tc≤(arrival time for host vehicle M to node)<toverlap+tc (16)
toverlap=D/{max(|vxego−vxother|,ε)} (17)
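Expressions (16) and (17) can be sketched together: (17) estimates the time until the host vehicle draws level with the peripheral vehicle (gap D divided by the relative forward speed, floored at ε to avoid division by zero), and (16) tests whether a node's arrival time falls inside the window around that overlap time. The function layout is an illustrative assumption.

```python
def overlap_window(D, vx_ego, vx_other, t_c, eps=1e-3):
    """Expression (17): time until the host vehicle overlaps the peripheral
    vehicle, with the closing speed floored at eps. Returns the overlap time
    and a predicate implementing the window test of Expression (16)."""
    t_overlap = D / max(abs(vx_ego - vx_other), eps)  # (17)

    def in_window(t_arrival):
        return t_overlap - t_c <= t_arrival < t_overlap + t_c  # (16)

    return t_overlap, in_window
```

For example, a 30 m gap closed at 5 m/s gives toverlap = 6 s; with tc = 1 s, a node reached at 5.5 s lies inside the window, while one reached at 7.0 s does not.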
(Processing Flow)
Next, the edge postulation unit 123A postulates nodes ND and edges ED in the evaluation section A1 (Step S102). Next, the edge postulation unit 123A determines whether or not any peripheral vehicle m leaning in the road width direction W is present (Step S104). When such a peripheral vehicle m is present, the edge postulation unit 123A revises the nodes ND and the edges ED (Step S106).
Next, the evaluation unit 123C sets NG nodes and an NG zone (Step S108) and applies costs to the edges ED (Step S110).
Next, the evaluation unit 123C determines whether or not the host vehicle M is in free traveling (Step S112). When the host vehicle M is in free traveling, the evaluation unit 123C searches for all of the routes leading to all of the trailing end nodes (Step S114). When not in free traveling, the evaluation unit 123C searches for all of the routes leading to the trailing end node corresponding to the target lane (Step S116). The evaluation unit 123C selects the route having the lowest total cost (Step S118) and determines whether or not the total cost of the selected route is equal to or greater than the threshold value (Step S120). When the total cost of the selected route is less than the threshold value, the action plan generation unit 123 generates a course based on the route selected in Step S118 (Step S122). Meanwhile, when the total cost of the selected route is equal to or greater than the threshold value, the action plan generation unit 123 determines to perform following traveling (Step S124).
(Simulation Result)
The inventor of this application has performed a simulation in which the functions of the embodiment described above are reproduced in a computer. The simulation result is as follows.
According to the vehicle system 1 (route determination device) of the embodiment, a route can be determined more quickly and appropriately by recognizing the peripheral circumstances of the host vehicle M, and by evaluating routes generated by joining edges that connect virtual nodes located with spacing in each of the forward moving direction and the road width direction ahead of the host vehicle M, based on the sum of the costs applied to the edges in accordance with the peripheral circumstances of the host vehicle.
As illustrated in
Hereinabove, an embodiment for executing the present invention has been described. However, the present invention is not limited to such an embodiment in any way, and various changes and replacements may be made within a range not departing from the gist of the present invention.
While preferred embodiments of the invention have been described and illustrated above, it should be understood that these are exemplary of the invention and are not to be considered as limiting. Additions, omissions, substitutions, and other modifications can be made without departing from the spirit or scope of the present invention. Accordingly, the invention is not to be considered as being limited by the foregoing description, and is only limited by the scope of the appended claims.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Feb 23 2018 | SUMIOKA, TADASHI | HONDA MOTOR CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 045058 | /0469 | |
Feb 28 2018 | Honda Motor Co., Ltd. | (assignment on the face of the patent) | / |