An imaging system and method for a vehicle are provided. The system includes an imager configured to acquire images of a scene external and forward of the vehicle and to generate image data corresponding to the acquired images. A controller is configured to receive the image data and to analyze an optical flow between successive image frames to compute a relative motion between the imager and the imaged scene, wherein the optical flow includes a pattern of apparent motion of objects of interest in the successive image frames.
1. An imaging system of a vehicle, comprising:
an imager configured to acquire images of a scene external and forward of the vehicle and to generate image data corresponding to the acquired images; and
a controller configured to:
detect a plurality of objects of interest in the image data, wherein each object of interest comprises one of a moving object and a stationary object;
analyze an optical flow between successive image frames, the optical flow including a pattern of apparent motion of the objects of interest caused by a relative motion between the imager and the scene;
compute a vertical position value based on a change in vertical position for each object of interest appearing in the successive image frames;
compute a vertical motion value based on a change in vertical position for only the objects of interest appearing in the successive image frames as having a common apparent motion in a vertical direction;
determine the relative motion in the vertical direction based on a relationship between the vertical position value and the vertical motion value;
determine the relative motion in a horizontal direction based on a vehicle heading; and
based on the relative motion determined in the vertical and horizontal directions, correct for apparent motion and generate a control signal provided to a vehicle device.
8. An imaging method of a vehicle, comprising the steps of:
using an imager to acquire images of a scene external and forward of the vehicle and generating image data corresponding to the acquired images;
providing a controller for:
detecting a plurality of objects of interest in the image data, wherein each object of interest comprises one of a moving object and a stationary object;
analyzing an optical flow between successive image frames, the optical flow including a pattern of apparent motion of the objects of interest caused by a relative motion between the imager and the scene;
computing a vertical position value based on a change in vertical position for each object of interest appearing in the successive image frames;
computing a vertical motion value based on a change in vertical position for only the objects of interest appearing in the successive image frames as having a common apparent motion in a vertical direction;
determining the relative motion in the vertical direction based on a relationship between the vertical position value and the vertical motion value;
determining the relative motion in a horizontal direction based on a vehicle heading; and
based on the relative motion determined in the vertical and horizontal directions, correcting for apparent motion and generating a control signal provided to a vehicle device.
14. A non-transitory computer-readable medium having stored thereon software instructions that, when executed by a processor, cause the processor to perform steps comprising:
using an imager to acquire images of a scene external and forward of a controlled vehicle and generating image data corresponding to the acquired images;
detecting a plurality of objects of interest in the image data, wherein each object of interest comprises one of a moving object and a stationary object;
analyzing an optical flow between successive image frames, the optical flow including a pattern of apparent motion of the objects of interest caused by a relative motion between the imager and the scene;
computing a vertical position value based on a change in vertical position for each object of interest appearing in the successive image frames;
computing a vertical motion value based on a change in vertical position for only the objects of interest appearing in the successive image frames as having a common apparent motion in a vertical direction;
determining the relative motion in the vertical direction based on a relationship between the vertical position value and the vertical motion value;
determining the relative motion in a horizontal direction based on a vehicle heading; and
based on the relative motion determined in the vertical and horizontal directions, correcting for apparent motion and generating a control signal provided to a vehicle device.
2. The imaging system of
3. The imaging system of
4. The imaging system of
5. The imaging system of
6. The imaging system of
9. The imaging method of
10. The imaging method of
11. The imaging method of
12. The imaging method of
13. The imaging method of
15. The non-transitory computer-readable medium of
16. The non-transitory computer-readable medium of
17. The non-transitory computer-readable medium of
18. The non-transitory computer-readable medium of
19. The non-transitory computer-readable medium of
This application claims priority to and the benefit under 35 U.S.C. §119(e) of U.S. Provisional Patent Application No. 61/867,379, filed on Aug. 19, 2013, entitled “SYSTEM AND METHOD FOR CONTROLLING EXTERIOR VEHICLE LIGHTS WITH EGO MOTION ESTIMATION,” the entire disclosure of which is hereby incorporated herein by reference.
The present invention generally relates to imaging systems, and more specifically to imaging systems for use with a vehicle.
According to one aspect of the present invention, an imaging system for a vehicle is provided. The system includes an imager configured to acquire images of a scene external and forward of the vehicle and to generate image data corresponding to the acquired images. A controller is configured to receive the image data and analyze an optical flow between successive image frames to compute a relative motion between the imager and the imaged scene, wherein the optical flow includes a pattern of apparent motion of objects of interest in the successive image frames.
According to another aspect of the present invention, an imaging method for a vehicle is provided. The method includes the steps of: providing an imager for acquiring images of a scene external and forward of the controlled vehicle and generating image data corresponding to the acquired images; providing a controller for receiving and analyzing the image data; and computing a relative motion between the imager and the imaged scene based on an optical flow between successive image frames, wherein the optical flow includes a pattern of apparent motion of objects of interest in the successive image frames.
According to yet another aspect of the present invention, a non-transitory computer-readable medium is provided. The non-transitory computer-readable medium has software instructions stored thereon that, when executed by a processor, include the steps of: using an imager to acquire images of a scene external and forward of the controlled vehicle and generating image data corresponding to the acquired images; receiving and analyzing the image data in a controller; and computing a relative motion between the imager and the imaged scene based on an optical flow between successive image frames, wherein the optical flow includes a pattern of apparent motion of objects of interest in the successive image frames.
These and other features, advantages, and objects of the present invention will be further understood and appreciated by those skilled in the art by reference to the following specification, claims, and appended drawings.
The present invention will become more fully understood from the detailed description and the accompanying drawings, wherein:
Reference will now be made in detail to the present preferred embodiments of the invention, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numerals will be used throughout the drawings to refer to the same or like parts. In the drawings, the depicted structural elements are not to scale and certain components are enlarged relative to the other components for purposes of emphasis and understanding.
The embodiments described herein relate to an imaging system for a vehicle that may be used to detect and optionally categorize objects in a scene forward of the vehicle. To assist in the understanding of an application of these embodiments, examples are provided that pertain to the use of the imaging system in an exterior light control system for controlling exterior lights of a controlled vehicle in response to image data acquired from an image sensor, which captures images forward of the vehicle. Auto High Beam (AHB) and alternate methods of controlling the light beam illumination in front of a motor vehicle maximize the use of high beams at night by identifying oncoming and preceding vehicles and automatically controlling the high beam lighting pattern. This prevents glare to other vehicles, yet maintains a high beam light distribution to illuminate areas not occupied by other vehicles. Prior systems are known for controlling exterior vehicle lights in response to images captured forward of the vehicle. In these prior systems, a controller would analyze the captured images and determine if any preceding or oncoming vehicles were present in a glare area in front of the vehicle employing the system. This "glare area" was the area in which the exterior lights would cause excessive glare to a driver if the exterior lights were in a high beam state (or some state other than a low beam state). If a vehicle was present in the glare area, the controller would respond by changing the state of the exterior lights so as to not cause glare for the other driver(s). Examples of such systems are described in U.S. Pat. Nos. 
5,837,994, 5,990,469, 6,008,486, 6,049,171, 6,130,421, 6,130,448, 6,166,698, 6,255,639, 6,379,013, 6,403,942, 6,587,573, 6,593,698, 6,611,610, 6,631,316, 6,653,614, 6,728,393, 6,774,988, 6,861,809, 6,906,467, 6,947,577, 7,321,112, 7,417,221, 7,565,006, 7,567,291, 7,653,215, 7,683,326, 7,881,839, 8,045,760, 8,120,652, and 8,543,254, the entire disclosures of which are incorporated herein by reference.
In some of the prior systems using AHB or alternative methods of controlling the light beam illumination in front of a motor vehicle, an imaging system would image a forward scene and the controller would analyze the captured images to detect whether the vehicle was in or entering a village (or town) that is sufficiently lighted. The controller would then typically either place the exterior lights in a low beam state or otherwise inhibit operation of high beam headlights. The high beams or alternate beam illumination are then reactivated when the village area is exited. Various methods are used including detecting streetlights or measuring the ambient brightness level when entering a village to determine whether to activate or re-activate the high beam headlights. Examples of such systems are described in U.S. Pat. Nos. 6,861,809, 7,565,006, and 8,045,760, and also in United States Patent Application Publication No. US 20130320193 A1, the entire disclosures of which are incorporated herein by reference.
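The glare-area and village behavior described for these prior systems can be illustrated with a toy decision rule. The sketch below is illustrative only: the function name, the ambient-light threshold, and the boolean glare test are invented for the example, whereas real systems decide based on the detected positions of vehicles and streetlights in the imaged scene.

```python
# Toy illustration (NOT the patented logic): drop to low beams when another
# driver would be glared or when the scene is already well lit (e.g. a
# lighted village). Threshold and parameter names are assumptions.

def select_beam_state(oncoming_in_glare_area, ambient_lux, village_lux=50.0):
    """Return 'low' when high beams would glare another driver or when the
    ambient brightness suggests a lighted village; otherwise 'high'."""
    if oncoming_in_glare_area or ambient_lux > village_lux:
        return "low"
    return "high"

state_dark_road = select_beam_state(False, 1.0)    # no glare risk, dark scene
state_village = select_beam_state(False, 120.0)    # bright, lighted area
state_oncoming = select_beam_state(True, 1.0)      # oncoming vehicle detected
```

In practice the same decision would be re-evaluated on every image frame, so the beams reactivate automatically once the village area is exited or the oncoming vehicle passes.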
The aforementioned prior systems illustrate just a few ways in which the exterior lights of a controlled vehicle may be controlled in response to changing driving conditions. Oftentimes, proper operation of these and other similar systems requires accurate detection of one or more objects of interest in an imaged scene. Depending on the application, these objects of interest may be stationary objects such as streetlights, lane markers, signs, and/or moving objects such as the headlights or taillights of other travelling vehicles. Proper detection of objects of interest may be affected if the motion of an imaging system relative to the imaged scene, referred to herein as “ego motion,” is unknown. During routine driving situations, there are many common conditions that may alter the motion of an imaging system relative to the imaged scene, such as bumpy roads, sudden turns, inclines/declines, etc. These conditions may cause stationary objects of interest to have apparent motion in successive image frames. Thus, if the ego motion of the imaging system is not accounted for, it may be difficult for an imaging system to perform various imaging operations since the system may be unable to determine which objects are actually moving and which objects only appear to be moving as a result of the imaging system's ego motion, and to a similar extent, the controlled vehicle's ego motion. Thus, in light of the above, an imaging system of a controlled vehicle is advantageously provided herein and is configured to analyze the optical flow between successive image frames to estimate the ego motion of its imaging system in order to correct for the apparent motion of imaged objects. As used herein, “optical flow” is defined as the pattern of apparent motion of objects of interest in successive image frames caused by the relative motion between the imaging system and the scene being imaged.
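The definition of optical flow above can be made concrete with a minimal sketch that pairs objects detected in two successive frames and reports their apparent motion vectors. The nearest-neighbor matching step, the distance threshold, and the object positions are assumptions made for the example, not details taken from this disclosure.

```python
# Minimal sketch (not the claimed method): the optical flow of detected
# objects is their apparent displacement between successive frames.

def match_objects(prev_positions, curr_positions, max_dist=20.0):
    """Greedily pair each object in the previous frame with its nearest
    neighbor in the current frame, within a distance threshold, and return
    the apparent motion vector (dx, dy) for each matched pair."""
    flow = []
    unused = list(curr_positions)
    for (px, py) in prev_positions:
        if not unused:
            break
        cx, cy = min(unused, key=lambda p: (p[0] - px) ** 2 + (p[1] - py) ** 2)
        if (cx - px) ** 2 + (cy - py) ** 2 <= max_dist ** 2:
            flow.append((cx - px, cy - py))
            unused.remove((cx, cy))
    return flow

# Two successive frames in which every object appears to shift down by
# 3 pixels, e.g. because the vehicle pitched over a bump: stationary objects
# share a common apparent motion caused purely by ego motion.
prev_frame = [(100, 50), (200, 80), (320, 40)]
curr_frame = [(100, 53), (200, 83), (320, 43)]
vectors = match_objects(prev_frame, curr_frame)
```

The common (0, 3) displacement across all stationary objects is exactly the signature that the ego-motion estimate described below is designed to detect and remove.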
A first embodiment of an imaging system 10 is shown in
If imaging system 10 is used in a vehicle equipment control system, controller 30 may be configured to directly connect to the equipment (50) being controlled such that the generated control signals directly control the equipment. Alternatively, controller 30 may be configured to connect to an equipment control (60 and 70), which, in turn, is connected to the equipment being controlled (62 and 80) such that the control signals generated by controller 30 only indirectly control the equipment. For example, in the case of the equipment being exterior lights 80, controller 30 may analyze the image data from imager 20 so as to generate control signals that are more of a recommendation for an exterior light control 70 to use when controlling exterior lights 80. Thus, it can be said that the control signals are used to control the equipment. The control signals may further include not just a recommendation, but also a code representing a reason for the recommendation so that equipment controls (60 and 70) may determine whether or not to override a recommendation.
As shown in
According to one embodiment, the equipment that system 10 can control may include one or more exterior lights 80 and the control signal generated by controller 30 may be an exterior light control signal. In this embodiment, exterior lights 80 may be controlled directly by controller 30 or by an exterior light control 70, which receives a control signal from controller 30. As used herein, the term "exterior lights" broadly includes any exterior lighting on the vehicle. Such exterior lights may include headlights (both low and high beam if separate from one another), taillights, foul weather lights such as fog lights, brake lights, center high-mounted stop lamps (CHMSLs), turn signals, back-up lights, etc. The exterior lights may be operated in several different modes including conventional low beam and high beam states. They may also be operated as daytime running lights, and additionally as super-bright high beams in those countries where they are permitted.
The exterior light brightness may also be continuously varied between the low, high, and super-high states. Separate lights may be provided for obtaining each of these exterior lighting states or the actual brightness of the exterior lights may be varied to provide these different exterior lighting states. In either case, the “perceived brightness” or illumination pattern of the exterior lights is varied. As used herein, the term “perceived brightness” means the brightness of the exterior lights as perceived by an observer outside the vehicle. Most typically, such observers will be drivers or passengers in a preceding vehicle or in a vehicle traveling along the same street in the opposite direction. Ideally, the exterior lights are controlled such that if an observer is located in a vehicle within a “glare area” relative to the vehicle (i.e., the area in which the observer would perceive the brightness of the exterior lights as causing excessive glare), the beam illumination pattern is varied such that the observer is no longer in the glare area. The perceived brightness and/or glare area of the exterior lights may be varied by changing the illumination output of one or more exterior lights, by steering one or more lights to change the aim of one or more of the exterior lights, selectively blocking or otherwise activating or deactivating some or all of the exterior lights, altering the illumination pattern forward of the vehicle, or a combination of the above.
Imager 20 may be any conventional imager. Examples of suitable imagers are disclosed in published United States Patent Application Publication Nos. US 20080192132 A1 and US 20120072080 A1, and in U.S. Provisional Application Nos. 61/500,418 entitled “MEDIAN FILTER” filed on Jun. 23, 2011, by Jon H. Bechtel et al.; 61/544,315 entitled “MEDIAN FILTER” and filed on Oct. 7, 2011, by Jon H. Bechtel et al.; 61/556,864 entitled “HIGH DYNAMIC RANGE CAMERA LOW LIGHT LEVEL FILTERING” filed on Nov. 8, 2011, by Jon H. Bechtel et al., the entire disclosures of which are incorporated herein by reference.
The imaging system 10 may include an image sensor (201,
In the example shown in
Controller 30 can also take advantage of the availability of signals (such as vehicle speed, steering wheel angle, pitch, roll, and yaw) communicated via discrete connections or over the vehicle bus 25 in making decisions regarding the operation of the exterior lights 80. In particular, speed input 21 provides vehicle speed information to the controller 30 from which speed can be a factor in determining the control state for the exterior lights 80 or other equipment. The reverse signal 22 informs controller 30 that the vehicle is in reverse, responsive to which the controller 30 may clear an electrochromic mirror element regardless of signals output from light sensors. Auto ON/OFF switch input 23 is connected to a switch having two states to dictate to controller 30 whether the vehicle exterior lights 80 should be automatically or manually controlled. The auto ON/OFF switch (not shown) connected to the ON/OFF switch input 23 may be incorporated with the headlight switches that are traditionally mounted on the vehicle dashboard or incorporated into steering wheel column levers. Manual dimmer switch input 24 is connected to a manually actuated switch (not shown) to provide a manual override signal for an exterior light control state. Some or all of the inputs 21, 22, 23, 24 and outputs 42a, 42b, and 42c, as well as any other possible inputs or outputs, such as a steering wheel input, can optionally be provided through vehicle bus 25 shown in
Controller 30 can control, at least in part, other equipment 50 within the vehicle, which is connected to controller 30 via vehicle bus 42. Specifically, the following are examples of equipment 50 that may be controlled by controller 30: exterior lights 80, a rain sensor, a compass, information displays, windshield wipers, a heater, a defroster, a defogger, an air conditioning system, a telephone system, a navigation system, a security system, a tire pressure monitoring system, a garage door opening transmitter, a remote keyless entry system, a telematics system, a voice recognition system such as a digital signal processor based voice actuation system, a vehicle speed control, interior lights, rearview mirrors, an audio system, an engine control system, and various other switches and other display devices that may be located throughout the vehicle.
In addition, controller 30 may be, at least in part, located within a rearview assembly of a vehicle or located elsewhere within the vehicle. The controller 30 may also use a second controller (or controllers), equipment control 60, which may be located in a rearview assembly or elsewhere in the vehicle in order to control certain kinds of equipment 62. Equipment control 60 can be connected to receive via vehicle bus 42 control signals generated by controller 30. Equipment control 60 subsequently communicates with and controls equipment 62 via bus 61. For example, equipment control 60 may be a windshield wiper control unit which controls windshield wiper equipment, turning this equipment ON or OFF. Equipment control 60 may also be an electrochromic mirror control unit, in which case controller 30 is programmed to communicate with the electrochromic control unit in order for the electrochromic control unit to change the reflectivity of the electrochromic mirror(s) in response to information obtained from an ambient light sensor, a glare sensor, as well as any other components coupled to the processor. Specifically, equipment control unit 60 in communication with controller 30 may control the following equipment: exterior lights, a rain sensor, a compass, information displays, windshield wipers, a heater, a defroster, a defogger, an air conditioning system, a telephone system, a navigation system, a security system, a tire pressure monitoring system, a garage door opening transmitter, a remote keyless entry system, a telematics system, a voice recognition system such as a digital signal processor based voice actuation system, a vehicle speed control, interior lights, rearview mirrors, an audio system, a climate control system, an engine control system, and various other switches and other display devices that may be located throughout the vehicle.
Portions of system 10 can be advantageously integrated into a rearview assembly 200 as illustrated in
Referring to
Controller 30 of
Rearview assembly 200 may include a mirror element or a display that displays a rearward view. The mirror element may be a prismatic element or an electro-optic element, such as an electrochromic element.
Additional details of the manner by which system 10 may be integrated into a rearview mirror assembly 200 are described in U.S. Pat. No. 6,611,610, the entire disclosure of which is incorporated herein by reference. Alternative rearview mirror assembly constructions used to implement imaging systems are disclosed in U.S. Pat. No. 6,587,573, the entire disclosure of which is incorporated herein by reference.
A method for computing and correcting for ego motion will now be described and may be used with the previously described imaging system 10. For purposes of illustration, the method is described below as being implemented by controller 30 using image data received from imager 20. The method may be a subroutine executed by any processor, and thus may be embodied in a non-transitory computer-readable medium having stored thereon software instructions that, when executed by a processor, cause the processor to control the equipment of the controlled vehicle by executing the steps of the method described below. In other words, aspects of the inventive method may be achieved by software stored on a non-transitory computer-readable medium, or by software modifications or updates to existing software residing in a non-transitory computer-readable medium. Such software or software updates may be downloaded into a first non-transitory computer-readable medium 32 of controller 30 (or locally associated with controller 30 or some other processor), typically prior to being installed in a vehicle, from a second non-transitory computer-readable medium 90 located remote from the first non-transitory computer-readable medium 32 (See
According to one implementation, the method for computing the ego motion of the imaging system 10 includes computing a relative motion between the imager 20 and the imaged scene in both a horizontal X direction and a vertical Y direction, which will now be described in greater detail with reference to
Discussion first turns to step 1100, from which the controller 30 proceeds to steps 1300 and 1400, which may be performed in parallel. In step 1300, the controller 30 computes a vertical position value, which is based on a change in vertical position for a number of detected objects of interest appearing in successive image frames and will be described in further detail in reference to
Referring back to step 1400, the controller 30 computes a vertical motion value. The vertical motion value is based on a change in vertical position for only those detected objects of interest appearing in successive image frames and having a common apparent motion in the vertical direction. The computation of the vertical motion value will be described in further detail with reference to
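Steps 1300 and 1400 can be sketched as follows. The specification does not give the exact statistics used, so a simple mean is assumed, and the "common apparent motion" test is approximated by the majority sign of the vertical displacements; both choices are assumptions made purely for illustration.

```python
# Hedged sketch of steps 1300 and 1400; the mean statistic and the
# majority-sign test for "common apparent motion" are assumptions.

def vertical_position_value(dy_all):
    """Mean vertical displacement over ALL detected objects (step 1300)."""
    return sum(dy_all) / len(dy_all)

def vertical_motion_value(dy_all):
    """Mean vertical displacement over only those objects sharing a common
    apparent direction of vertical motion (step 1400). The majority sign is
    taken as the common direction (image y grows downward)."""
    down = [d for d in dy_all if d > 0]
    up = [d for d in dy_all if d < 0]
    common = down if len(down) >= len(up) else up
    return sum(common) / len(common) if common else 0.0

# Three stationary objects appear to shift down ~3 px because of vehicle
# pitch; one genuinely moving object shifts up and should not dominate.
dy = [3.0, 3.2, 2.8, -6.0]
pos_val = vertical_position_value(dy)   # pulled off by the moving object
mot_val = vertical_motion_value(dy)     # common-direction objects only
```

The contrast between the two values is the point of computing both: the motion value resists contamination by genuinely moving objects, while the position value uses every available detection.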
Having completed steps 1300 and 1400, the controller 30 computes a weighted average between the vertical position value and the vertical motion value in step 2200. The weighted average indicates the relative motion between the imager 20 and the imaged scene in the vertical direction. Accordingly, the weighted average may be used to correct for apparent motion caused by the ego motion of imager 20 in the vertical direction. Once the correction has been made, the controller 30 ends the Ego Y process at step 2400 and may return back to step 1100 to repeat the Ego Y process so long as objects of interest are present in subsequent image frames.
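Step 2200 can be sketched as a weighted average of the two values, followed by the apparent-motion correction; the weight itself is not specified in the description above, so a fixed parameter `w` is assumed purely for illustration.

```python
# Hedged sketch of step 2200 and the vertical correction; the weight w and
# the simple subtraction-based correction are illustrative assumptions.

def ego_y(position_value, motion_value, w=0.5):
    """Weighted average of the vertical position and vertical motion values,
    taken as the estimated vertical ego motion of the imager (pixels/frame)."""
    return w * position_value + (1.0 - w) * motion_value

def correct_vertical(displacements, ego):
    """Subtract the estimated vertical ego motion from each object's
    apparent vertical displacement; stationary objects end up near zero."""
    return [d - ego for d in displacements]

# Values from the previous sketch: position value 0.75, motion value 3.0.
est = ego_y(0.75, 3.0, w=0.25)
corrected = correct_vertical([3.0, 3.2, -6.0], est)
```

After the correction, a residual displacement near zero marks an object as stationary, while a large residual (here the third object) marks genuine motion in the scene.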
Discussion now turns to the Ego X process, which begins at step 1200. In step 2500, the controller 30 obtains and transforms a yaw signal of the vehicle to the image domain. Based on the transformed yaw signal, the controller 30 computes a vehicle heading in step 2600. The controller 30 then takes a time average of the vehicle heading in step 2700, which indicates the relative motion between the imager 20 and the imaged scene in the horizontal direction and may be used accordingly in step 2800 to correct for apparent motion in the horizontal direction caused by the ego motion of the imager 20. Once the correction has been made, the controller 30 ends the Ego X process at step 2400. The controller 30 may then return back to step 1200 to repeat the Ego X process so long as objects of interest are present in subsequent image frames.
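The Ego X branch can be sketched in the same hedged spirit. The pixels-per-degree calibration constant and the averaging window below are assumptions, since the description states only that a yaw signal is transformed to the image domain and that the resulting heading is time averaged.

```python
# Hedged sketch of steps 2500-2800; PIXELS_PER_DEGREE and the window
# length n are assumed calibration parameters, not disclosed values.
from collections import deque

PIXELS_PER_DEGREE = 12.0  # assumed imager calibration (focal length / pixel pitch)

def yaw_to_image(yaw_deg_per_frame):
    """Step 2500: map a per-frame yaw change (degrees) to a horizontal
    shift in the image, in pixels."""
    return yaw_deg_per_frame * PIXELS_PER_DEGREE

class HeadingAverager:
    """Steps 2600-2700: running time average of the vehicle heading over
    the most recent n frames."""
    def __init__(self, n=5):
        self.samples = deque(maxlen=n)

    def update(self, heading_px):
        self.samples.append(heading_px)
        return sum(self.samples) / len(self.samples)

avg = HeadingAverager(n=3)
ego_x = 0.0
for yaw in (0.5, 0.5, 0.5):  # steady turn, half a degree of yaw per frame
    ego_x = avg.update(yaw_to_image(yaw))
# ego_x is the horizontal apparent-motion correction (step 2800)
```

Time averaging the heading damps frame-to-frame noise in the yaw signal, so a brief steering jitter does not masquerade as horizontal motion of objects in the scene.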
The above description is considered that of the preferred embodiments only. Modifications of the invention will occur to those skilled in the art and to those who make or use the invention. Therefore, it is understood that the embodiments shown in the drawings and described above are merely for illustrative purposes and not intended to limit the scope of the invention, which is defined by the claims as interpreted according to the principles of patent law, including the doctrine of equivalents.
Wright, David J., Falb, David M.
Executed on | Assignor | Assignee | Conveyance | Reel/Frame
Aug 18 2014 | WRIGHT, DAVID J. | Gentex Corporation | Assignment of assignors' interest (see document for details) | 033563/0084
Aug 18 2014 | FALB, DAVID M. | Gentex Corporation | Assignment of assignors' interest (see document for details) | 033563/0084
Aug 19 2014 | Gentex Corporation | (assignment on the face of the patent) | |
Aug 29 2023 | Gentex Corporation | HL KLEMOVE CORPORATION | Assignment of assignors' interest (see document for details) | 065696/0445