Disclosed is a steering control system and method for autonomous intelligent vehicles. The system includes image input means for supplying images of the road in front of the vehicle; a plurality of image grabbers which receive images from the image input means and capture image signals corresponding to the road; a first controller which determines if the vehicle is being driven within the lane using near image signals received from the image grabbers; a second controller which determines a driving direction of the vehicle and detects curves in the road using distant image signals received from the image grabbers; a steering controller which analyzes the information received from the first and second controllers to determine a steering angle and direction, and outputs control signals corresponding to the analysis; and drive means for driving a steering system of the vehicle in a direction and at an angle corresponding to the control signals received from the steering controller.
8. A steering control system for a vehicle, comprising:
a first camera for capturing first images appearing in front of the vehicle; a second camera for capturing second images appearing in front of the vehicle, said second images being farther away from the vehicle than said first images; a first controller having a first output responsive to said first images; a second controller having a second output responsive to said second images; a steering controller for generating a steering angle and steering direction as a function of said first and second outputs; and an automated driver for steering the vehicle in the steering angle and the steering direction.
4. A steering control method for steering an autonomous intelligent vehicle with a steering system on a road, the method comprising the steps of:
supplying near and distant images appearing in front of the vehicle after the images are processed to a predetermined state; capturing image signals corresponding to the road from the near and distant images, and determining a driving direction of the vehicle and detecting curves in the road from the distant images, and determining if the vehicle is being driven off-center from a road lane marker from the captured near images; determining a steering angle and a steering direction by analyzing the driving direction, the curves in the road and whether the vehicle is being driven off-center from the road lane marker; and driving the steering system of the vehicle according to the determined steering angle and the steering direction.
9. A vehicle with an autonomous steering control system comprising:
image input means for supplying images appearing in front of the vehicle, said image input means comprising a plurality of elements mounted at predetermined positions on the vehicle; a plurality of image grabbers for receiving images supplied from the image input means, and capturing image signals corresponding to a road, said captured images comprising near image signals and distant image signals; a first controller for determining if the vehicle is being driven within a lane of the road using the near image signals captured by the image grabbers; a second controller for determining a driving direction of the vehicle and detecting curves in the road using the distant image signals captured by the image grabbers; a steering controller for analyzing the determinations of the first and second controllers and outputting control signals comprising a steering angle and a steering direction corresponding to the analysis; and drive means for driving the steering system of the vehicle in the steering direction and the steering angle corresponding to the control signals outputted from the steering controller.
1. A steering control system for an autonomous intelligent vehicle having a steering system, said steering control system comprising:
image input means for supplying images appearing in front of the vehicle, said image input means comprising a plurality of elements mounted at predetermined positions on the vehicle; a plurality of image grabbers receiving images supplied from the image input means, and capturing image signals corresponding to a road, said captured images comprising near image signals and distant image signals; a first controller determining if the vehicle is being driven within a lane of the road using the near image signals captured by the image grabbers; a second controller determining a driving direction of the vehicle and detecting curves in the road using the distant image signals captured by the image grabbers; a steering controller analyzing determinations of the first and second controllers and outputting control signals comprising a steering angle and a steering direction corresponding to the analysis; and drive means for driving the steering system of the vehicle in the steering direction and the steering angle corresponding to the control signals outputted from the steering controller.
2. The steering control system of
3. The steering control system of
5. The steering control method of
αnθn+α(n+1)θ(n+1)+α(n+2)θ(n+2)+ . . . =θtotal.
6. The steering control method of
Steering Angle=β×off-center rate+γ×driving direction+δ×vehicle speed, where β, γ, and δ are measured ratio constants, and β>γ>δ.
7. The steering control method of
where αx is the angle made by an intersection of a central line with a horizontal line x, θx is the ratio constant of the road for an interval between horizontal lines, and k is a finite number greater than n.
10. The vehicle of
11. The vehicle of
(a) Field of the Invention
The present invention relates to an autonomous intelligent vehicle. More particularly, the present invention relates to a steering control system and method for an autonomous intelligent vehicle in which information of near and distant road conditions is received through cameras and used to control the steering of a vehicle.
(b) Description of the Related Art
There has been continued, rapid development in automotive technology since the inception of the automobile industry. In recent times, advances have been concentrated more in the area of electronics than in mechanics. For example, there have been many electronics-related developments that improve engine performance and efficiency, in addition to more recent advances that provide intelligent safety capabilities (e.g., air bags that are activated for operation only if the passenger is over a predetermined weight) and technology that will enable the application of vehicles that can drive without the aid of a driver.
With regard to such autonomous intelligent vehicles, various institutions and organizations are vigorously pursuing research to perfect this technology. An example is the NAVLAB vehicle developed in the United States, which is capable of driving by itself on a road that is clear of obstacles. The GRAFE vehicle developed in Germany has capabilities that its developers claim enable the vehicle to travel driver-free on the Autobahn. These and other conventional autonomous intelligent vehicles utilize a camera that supplies road images, which are processed and analyzed. From the analysis, automatic controls are performed to maintain the vehicle in a certain lane or side of the road by steering the vehicle, and to maintain the vehicle at a suitable speed.
However, since only a single camera is used in these conventional vehicles to detect road conditions, either smooth cornering or precise lane keeping is compromised. That is, if the single camera is positioned at the optimal location on the vehicle for keeping the vehicle precisely in the car lane, approaching curves in the road cannot be detected, and steering control through the curve is not smooth. On the other hand, if the camera is positioned to detect approaching turns in the road, the minute adjustments needed to keep the vehicle precisely in the center of the car lane cannot be made.
The present invention has been made in an effort to solve the above problems.
It is an object of the present invention to provide a steering control system and method for autonomous intelligent vehicles in which two or more cameras are utilized and the detected image signals are processed in parallel, such that control both for approaching curves and for maintaining the vehicle precisely between the road lane markers can be performed.
To achieve the above object, the present invention provides a steering control system and method for autonomous intelligent vehicles. The system includes image input means for supplying images of the road in front of the vehicle, a plurality of elements of the image input means being mounted at predetermined locations on the vehicle; a plurality of image grabbers which receive images from the image input means and capture image signals corresponding to the road; a first controller which determines if the vehicle is being driven within the lane using near image signals received from the image grabbers; a second controller which determines a driving direction of the vehicle and detects curves in the road using distant image signals received from the image grabbers; a steering controller which analyzes the information received from the first and second controllers to determine a steering angle and direction, and outputs control signals corresponding to the analysis; and drive means for driving a steering system of the vehicle in a direction and at an angle corresponding to the control signals received from the steering controller.
According to a feature of the present invention, the image input means comprises first and second cameras mounted on opposite sides of the front of a vehicle; and a third camera mounted to an upper, center portion of the windshield of a vehicle.
According to another feature of the present invention, the plurality of image grabbers comprises a first, second and third image grabber corresponding respectively to the first, second and third cameras, the image grabbers processing image data received from the cameras.
The method of the present invention includes the steps of supplying near and distant images of the road in front of a vehicle after the images are processed to a predetermined state; capturing image signals corresponding to the road from the near and distant images, determining a driving direction of the vehicle and detecting curves in the road from the distant images, and determining if the vehicle is being driven off-center from road lane markers using the near images; determining a steering angle and direction by analyzing the information on the driving direction, the curves in the road, and whether the vehicle is being driven off-center from the road lane markers; and driving a steering system of the vehicle according to the determined steering angle and direction to control steering of the vehicle.
According to a feature of the present invention, curves in the road are detected from the distant images using Equation 1 below:
αnθn+α(n+1)θ(n+1)+α(n+2)θ(n+2)+ . . . =θtotal Equation 1
According to another feature of the present invention, the steering angle is determined using Equation 2 below:
Steering Control=β×off-center rate+γ×driving direction+δ×vehicle speed, Equation 2
where β, γ and δ are measured ratio constants, and β>γ>δ.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate an embodiment of the invention, and, together with the description, serve to explain the principles of the invention:
FIG. 1 is a perspective view of an autonomous intelligent vehicle used to describe positioning of cameras utilized in a preferred embodiment of the present invention;
FIG. 2 is a block diagram of a steering control system for an autonomous intelligent vehicle according to a preferred embodiment of the present invention;
FIG. 3 is a flow chart of a steering control method for an autonomous intelligent vehicle according to a preferred embodiment of the present invention;
FIG. 4a is a schematic view illustrating the road as detected by a second camera according to a preferred embodiment of the present invention;
FIG. 4b is a schematic view illustrating the road as detected by a first camera according to a preferred embodiment of the present invention; and
FIG. 5 is a schematic view illustrating the road as detected by a third camera according to a preferred embodiment of the present invention.
A preferred embodiment of the present invention will now be described in detail with reference to the accompanying drawings.
Referring first to FIG. 2, shown is a block diagram of a steering control system for an autonomous intelligent vehicle according to a preferred embodiment of the present invention. As shown in the drawing, the inventive steering control system comprises first, second and third cameras 100, 200 and 300 provided at predetermined positions on the vehicle, as shown in FIG. 1, which supply images of the road in front of the vehicle; the first and second cameras 100 and 200 provide images near the vehicle, and the third camera 300 provides images that are more distant. First, second and third image grabbers 110, 210 and 310 process the images input from the first, second and third cameras 100, 200 and 300, respectively, by capturing only the image signals corresponding to the road. A first controller 120 receives the image signals input from the first and second image grabbers 110 and 210 and, after extracting the image signals corresponding to lane conditions, determines if the vehicle is being driven within the lane. A second controller 320 receives the image signals input from the third image grabber 310 and, after extracting the image signals corresponding to the road, determines the present driving state of the vehicle and detects approaching curves. A steering controller 30 receives data related to road conditions from the first and second controllers 120 and 320, determines a steering angle of the vehicle, and outputs control signals for the steering angle. Finally, a driver 40 drives a steering system of the vehicle in a direction and at an angle corresponding to the control signals received from the steering controller 30.
The operation of the steering control system structured as in the above will be described hereinafter with reference to the flow chart of FIG. 3.
First, in step S20, when the autonomous intelligent vehicle begins to drive, the third camera 300 supplies distant images in front of the vehicle to the third image grabber 310. At this time, the third camera 300 first processes the images into image data information before input to the third image grabber 310.
Next, in step S30, the third image grabber 310 captures the image data information input from the third camera 300 in units of frames, then converts this information into digital signals (using a converter installed therein) and saves the same, after which the digital signals are applied to the second controller 320 in sequence.
Subsequently, the second controller 320 binarizes the image data received as digital signals using a predetermined threshold value; then, using a Gaussian filter method in which specific brightness levels are detected, the characteristics of the road are extracted as shown in FIG. 5.
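As an illustrative sketch only (not the patent's actual implementation), this step can be read as smoothing a grayscale frame with a small Gaussian kernel and then binarizing at a brightness threshold so that bright lane-marker pixels survive; the kernel size, threshold value, and function name are all assumptions:

```python
import numpy as np

def extract_road_pixels(gray, threshold=200):
    """Smooth a grayscale frame with a 3x3 Gaussian kernel, then keep only
    pixels brighter than the threshold -- lane markers are typically the
    brightest road features. (Illustrative sketch; parameters assumed.)"""
    kernel = np.outer([1, 2, 1], [1, 2, 1]) / 16.0   # separable 3x3 Gaussian
    padded = np.pad(gray.astype(float), 1, mode="edge")
    h, w = gray.shape
    smoothed = sum(
        kernel[dy, dx] * padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    )
    return smoothed > threshold   # binary road-marker mask
```

The result is a binary mask from which the lane-marker geometry in FIG. 5 could be measured.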
Next, in step S40, the second controller 320 divides the road into predetermined horizontal intervals (e.g., 5 meters) and determines a central value of the road by connecting the central values of the detected road lane markers. Angles between a central line CL of the road, determined by these central values, and horizontal lines 1-4, defined by the horizontal intervals, are then obtained to determine if there is an approaching curve in the road and, if so, the sharpness of the curve. In this step, the second controller 320 also determines the present driving direction of the vehicle.
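A sketch of how the angles between the central line CL and the horizontal lines might be obtained (not part of the patent; the coordinate frame and the convention of measuring each angle as a deviation from straight ahead are assumptions for illustration):

```python
import math

def central_line_angles(lane_centers):
    """lane_centers: list of (x, y) road-center points, one per horizontal
    interval, nearest first, in a frame where +y points straight ahead.
    Returns, for each segment of the central line CL, its angle (degrees)
    relative to straight ahead -- 0 for a straight road (assumed convention).
    """
    angles = []
    for (x0, y0), (x1, y1) in zip(lane_centers, lane_centers[1:]):
        # atan2 gives the segment's direction; subtracting from 90 degrees
        # measures how far it deviates from the straight-ahead direction
        angles.append(abs(90.0 - math.degrees(math.atan2(y1 - y0, x1 - x0))))
    return angles
```

These per-interval angles would serve as the α values used in Equation 1 below.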
The following Equation 1 is used to determine the existence and sharpness of approaching curves in the road:
α1θ1+α2θ2+α3θ3+α4θ4=θtotal (curve in the road), Equation 1
where α1, α2, α3 and α4 are the angles made by the intersection of the central line CL with the horizontal lines 1, 2, 3 and 4, respectively, as shown in FIG. 5; and θ1, θ2, θ3 and θ4 are ratio constants of the road for each interval. For example, in the case where the horizontal line 1 is 30 m and the horizontal line 4 is 10 m, θ4=1 and θ1=0.33.
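As an illustrative sketch only (not part of the disclosed system), the weighted sum of Equation 1 can be computed as follows; the function name and the rule for deriving the ratio constants θ from the interval distances are assumptions based on the θ4=1, θ1=0.33 example above:

```python
def curve_sharpness(angles_deg, distances_m):
    """Equation 1 sketch: theta_total = sum of alpha_x * theta_x.

    angles_deg[x]  : angle alpha_x between the road's central line CL and
                     horizontal line x, measured from straight ahead.
    distances_m[x] : distance of horizontal line x from the vehicle; the
                     ratio constant theta_x weights nearer lines more
                     heavily (assumed rule: theta = nearest / distance,
                     matching theta4 = 1 at 10 m and theta1 = 0.33 at 30 m).
    """
    nearest = min(distances_m)
    ratios = [nearest / d for d in distances_m]   # e.g. 10 m / 30 m = 0.33
    return sum(a * t for a, t in zip(angles_deg, ratios))

# A larger theta_total indicates a sharper approaching curve;
# zero angles (a straight road) give theta_total = 0.
```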
In step S50, using the results of Equation 1, the second controller 320 outputs signals corresponding to its calculations to the steering controller 30, which, in turn, outputs control signals to the driver 40. The driver 40 then controls the steering system of the vehicle according to the control signals received from the steering controller 30.
Further, occurring simultaneously in step S20 above, the first and second cameras 100 and 200 supply images respectively of the left and right lane markers of the road, and surrounding areas, to the first and second image grabbers 110 and 210, respectively. At this time, the first and second cameras 100 and 200 first process the images into image data information before input respectively to the first and second image grabbers 110 and 210.
Next, in step S30, the first and second image grabbers 110 and 210 capture the image data information input from the first and second cameras 100 and 200, respectively, in units of frames, then convert this information into digital signals (using a converter installed therein) and save the same, after which the digital signals are applied to the first controller 120 in sequence.
Subsequently, the first controller 120 extracts the characteristics of the road, including the road lane markers, as shown in FIGS. 4a and 4b, by performing a predetermined calculation on the image signals input from the first and second image grabbers 110 and 210 using an algorithm. That is, Gauss's elimination method is used to detect the road lane markers from the input image signals.
In more detail, the right road lane marker, as shown in FIG. 4a, is detected from the image signals input from the second image grabber 210, and a distance d1 between the right side of the vehicle and the right road lane marker is determined; and a left road lane marker, as shown in FIG. 4b, is detected from the image signals input from the first image grabber 110, and a distance d2 between the left side of the vehicle and the left road lane marker is determined.
Following the above, in step S40, it is determined using the distances d1 and d2 whether the vehicle is centered between the right and left lane markers. If it is determined in step S40 that the vehicle is off-center, the first controller 120 outputs corresponding information to the steering controller 30, which, in turn, outputs steering control signals to the driver 40. In step S50, using the input steering control signals, the driver 40 drives the steering system of the vehicle to return it to the center of the road lane markers.
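A minimal sketch of the off-center check described above (the function name `off_center_rate` and the normalization and sign conventions are assumptions, not from the patent):

```python
def off_center_rate(d1, d2):
    """Normalized lateral offset from lane center.

    d1 : distance from the right side of the vehicle to the right lane marker
    d2 : distance from the left side of the vehicle to the left lane marker
    Returns 0.0 when the vehicle is centered (d1 == d2); positive when the
    vehicle is offset toward the right marker, negative toward the left
    (assumed convention).
    """
    return (d2 - d1) / (d1 + d2)
```

Normalizing by the total gap d1 + d2 makes the rate independent of lane width, which is one plausible way to feed it into the final steering calculation.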
In step S50 above, signals received from the second controller 320 result in overall steering control, while signals received from the first controller 120 result in minute adjustments to steering as the vehicle drifts from the center of the road lane markers. In other words, the third camera 300 is used for overall steering and to steer the vehicle through corners, and the first and second cameras 100 and 200 are used to control precise adjustments in steering.
In the present invention, a final calculation is performed to control steering. Since steering using only the above method results in control that is not smooth, the final steering value is calculated using the following equation:
Steering Control=β×off-center rate+γ×driving direction+δ×vehicle speed, Equation 2
where β, γ and δ are measured ratio constants, and β>γ>δ.
The off-center rate is obtained by the first controller 120 in step S40 as described above, and the driving direction is obtained by the second controller 320 in step S40 as described above. The values of β, γ and δ are obtained through experimentation. These values, together with the vehicle speed, are used in Equation 2 above, and steering is controlled such that abrupt, jerking movement is prevented.
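The final blending step of Equation 2 can be sketched as follows; the numeric constants are hypothetical placeholders chosen only to satisfy the patent's condition β > γ > δ, since the experimentally measured values are not disclosed:

```python
def steering_control(off_center, driving_direction_deg, speed_kmh,
                     beta=0.6, gamma=0.3, delta=0.1):
    """Equation 2 sketch: a weighted blend of the off-center rate, the
    driving direction, and the vehicle speed, with the off-center rate
    weighted most heavily (beta > gamma > delta). Constants are
    hypothetical; the patent's values are measured experimentally."""
    assert beta > gamma > delta
    return beta * off_center + gamma * driving_direction_deg + delta * speed_kmh
```

Because all three terms enter a single weighted sum, no one input can dominate the output, which is what smooths the abrupt corrections mentioned above.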
In the steering control system and method for autonomous vehicles described above, since a plurality of cameras are used to determine both distant and near road conditions, and to make changes and adjustments to steering based on the same, auto-driving control is smooth, accurate and safe.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Sep 28 1998 | KIM, GI-SEOK | Hyundai Motor Company | Assignment of assignor's interest (see document for details) | 009511/0747
Oct 09 1998 | | Hyundai Motor Company | (assignment on the face of the patent) |
Mar 02 1999 | SEO, JAE-HYUNG | Hyundai Motor Company | Assignment of assignor's interest (see document for details) | 009920/0633