An apparatus for controlling an airbag module in a vehicle cabin is provided. It includes a control apparatus for triggering the airbag module; a plenoptic camera configured as a camera module that generates image data having depth information within a prespecifiable region in the vehicle cabin, the prespecifiable region comprising at least a first and a second partial region of a trigger region of the airbag module; and an evaluation device which determines, on the basis of the image data, whether an object, in particular a vehicle occupant, is located in the first or the second partial region. The control apparatus controls the airbag module in dependence on the position of the object.
10. An airbag deployment control apparatus of a vehicle, comprising:
a single plenoptic camera capturing image data having depth information within a prespecifiable region of a vehicle cabin with the prespecifiable region comprising at least a first partial region, a second partial region, and a third partial region;
an airbag module including a trigger for deploying an airbag;
a control apparatus;
an evaluation device processing the image data to determine whether an occupant of the vehicle is located in the first partial region, the second partial region, or the third partial region at a time of impact,
wherein the control apparatus controls the airbag module such that
if the occupant is in the first partial region at the time of impact, the airbag module is triggered,
if the occupant is in the third partial region at the time of impact, the airbag module is not triggered,
wherein the evaluation device calculates if a head of the occupant will be located in the first partial region, the second partial region or the third partial region at a later time after the time of impact, and
wherein if the head of the occupant is in the second partial region, and the evaluation device calculates that the head of the occupant will be in the third partial region at the later time after the time of impact, then the airbag is not triggered.
1. An apparatus for controlling an airbag module in a vehicle cabin, the apparatus comprising:
a control apparatus connected to a trigger of the airbag module;
a plenoptic camera of a camera module, which generates image data having depth information within a prespecifiable region in the vehicle cabin with the prespecifiable region comprising at least a first partial region, a second partial region, and a third partial region of a trigger region of the airbag module; and
an evaluation device processing the image data to determine whether an object is located in the first partial region, the second partial region, or the third partial region,
wherein the control apparatus controls the airbag module such that
if the evaluation device determines the object is in the second partial region at the time of impact, the control apparatus partially deploys the airbag module,
if the evaluation device determines the object is in the third partial region at the time of impact, the control apparatus does not trigger the airbag module,
wherein the evaluation device calculates if the object will be located in the first partial region, the second partial region, or the third partial region at a later time after the time of impact, such that
if the object is located in the first partial region at the time of impact, and the evaluation device calculates that, at the later time after impact, the object will be in the second partial region, then the airbag module is partially deployed by the control apparatus.
2. The apparatus as claimed in claim 1, wherein the camera module comprises:
an objective that images objects in the prespecifiable region in an image plane of the camera module;
a sensor generating image data; and
a microlens array arranged between the sensor and the objective, wherein each microlens of the microlens array images the objects, which are imaged by the objective, on the sensor.
4. The apparatus as claimed in
5. The apparatus as claimed in
6. The apparatus as claimed in
7. The apparatus as claimed in
8. The apparatus as claimed in
9. The apparatus as claimed in
11. The airbag deployment control apparatus of a vehicle of
12. The airbag deployment control apparatus of a vehicle of
13. The airbag deployment control apparatus of a vehicle of
14. The apparatus as claimed in
15. The apparatus as claimed in
16. The apparatus as claimed in
17. The apparatus as claimed in
This nonprovisional application claims priority to German Patent Application No. 10 2013 005 039.1, which was filed in Germany on Mar. 25, 2013, and to U.S. Provisional Application No. 61/804,974, which was filed on Mar. 25, 2013, and which are both herein incorporated by reference.
Field of the Invention
The present invention relates to an apparatus for controlling an airbag module.
Description of the Background Art
Motor vehicles equipped with safety systems, for example airbag modules, are known from the prior art. The airbag modules serve to minimize injuries to occupants in the event of an accident of the motor vehicle.
However, if the occupant is located near the exit of the airbag module, there is a risk that the airbag module itself will injure the occupant. In this context, studies have shown that it is children in particular who can be severely injured by an airbag module. One study by the ADAC, for example, found that small children sitting in an infant seat can sustain serious head and neck injuries from the passenger airbag, solely because of the force with which the airbag deploys. Even a minor accident at low speed can lead to grave or fatal injuries. The ADAC study furthermore found that bigger children who sit too close to the dashboard are struck by the deploying airbag. This can result in broken bones, grazes and burns. Even more severe injuries can occur if the legs of the child are located in the deployment area of the airbag module.
In order to avoid these cases, various solutions are already known from the prior art. For example, it is known to switch off the airbag module in various driving situations.
DE 197 24 344 C1, which corresponds to U.S. Pat. No. 6,164,693, discloses a manually operable switch in a motor vehicle having at least one switch position that is assigned to an occupancy type for which the passenger airbag must not be activated. A control unit deactivates the passenger airbag when the switch is in this position, and means are provided which, after the engine has been started, prevent the vehicle from moving if the switch has not been actuated. To rule out a forgotten switch actuation, it is thus ensured that the vehicle can only be driven once the switch has been actuated. DE 44 10 402 A1 discloses a hand switch with which the airbag of a seat can be switched on and off.
It is additionally known to monitor an occupant area using a plurality of camera modules and to trigger the airbag module in dependence on the position of the occupant. However, such occupant safety systems are complicated to manage and relatively expensive to produce because of their many individual components, in particular the plurality of camera modules.
It is therefore an object of the invention to provide an apparatus for controlling an airbag module which has few individual parts.
In an embodiment of the invention, an apparatus for controlling an airbag module in a vehicle cabin has the following features: a control apparatus for triggering the airbag module; a plenoptic camera configured as a camera module, which generates image data having depth information within a prespecifiable region in the vehicle cabin, the prespecifiable region comprising at least a first and a second partial region of a trigger region of the airbag module; and an evaluation device which determines on the basis of the image data whether an object, in particular a vehicle occupant, is located in the first or second partial region, with the control apparatus controlling the airbag module in dependence on the position of the object. Using what is known as a plenoptic camera, it is possible to obtain depth information of the region to be measured with exactly one camera module. Further camera modules are therefore no longer necessary. According to the invention, the camera module, the evaluation device, the control apparatus and the airbag module are connected to one another such that they communicate at least electrically.
The technical feature “depth information” according to the invention can be understood to mean information which contains, for the generated image data, both the direction and the distance of objects within the prespecifiable region. The technical feature “prespecifiable region” according to the invention can be understood to mean the image angle of the plenoptic camera. For a desired region, for example the front region of a vehicle cabin, it is thus possible to calculate from the image data exactly where and in which position an object is located within said region. The partial regions of the trigger region of the airbag module located within the prespecifiable region, that is to say the first partial region, the second partial region and any further partial regions, can be selected freely depending on the vehicle type, the arrangement of the airbag module and the seat position of a vehicle occupant.
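Purely as an illustration of how such partial regions could be evaluated in software, the following minimal sketch classifies a measured occupant-to-airbag distance into partial regions. The class names, the distance-band model and all threshold values are assumptions made for this example and are not taken from the patent.

```python
from dataclasses import dataclass
from enum import Enum


class PartialRegion(Enum):
    """Illustrative labels for the partial regions of the trigger region."""
    FIRST = 1    # farthest from the airbag exit
    SECOND = 2   # intermediate
    THIRD = 3    # closest to the airbag exit
    OUTSIDE = 0  # outside the trigger region


@dataclass
class RegionModel:
    """Distance bands in metres measured from the airbag exit; the values are
    purely illustrative and would be chosen per vehicle type and seat position."""
    third_max: float = 0.25
    second_max: float = 0.45
    first_max: float = 0.90

    def classify(self, distance_to_airbag_m: float) -> PartialRegion:
        if distance_to_airbag_m < 0:
            return PartialRegion.OUTSIDE
        if distance_to_airbag_m <= self.third_max:
            return PartialRegion.THIRD
        if distance_to_airbag_m <= self.second_max:
            return PartialRegion.SECOND
        if distance_to_airbag_m <= self.first_max:
            return PartialRegion.FIRST
        return PartialRegion.OUTSIDE


if __name__ == "__main__":
    model = RegionModel()
    print(model.classify(0.6))  # PartialRegion.FIRST
```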
In an embodiment, the camera module furthermore comprises the following features: an objective which images objects in the prespecifiable region in an image plane of the camera module; a sensor generating image data; a microlens array which is arranged between the sensor and the objective, wherein each microlens in the microlens array images the objects, which are imaged by the objective, on the sensor.
In a further embodiment, the microlens array can be arranged in the image plane. In a further preferred embodiment, the microlens array is arranged between the image plane and the sensor. In a further preferred embodiment, a further lens is provided which is arranged between the sensor and the microlens array, wherein the further lens images the objects, which are imaged in an intermediate image plane by each individual microlens of the microlens array, on the sensor.
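As a rough illustration of where the image plane mentioned above lies, the following sketch applies the thin-lens equation to the objective; the focal length and the object distance used in the example are assumed values, not parameters from the patent.

```python
def thin_lens_image_distance(focal_length_mm: float, object_distance_mm: float) -> float:
    """Thin-lens equation 1/f = 1/d_o + 1/d_i, solved for the image distance d_i."""
    return 1.0 / (1.0 / focal_length_mm - 1.0 / object_distance_mm)


if __name__ == "__main__":
    # Example with assumed values: an occupant roughly 700 mm in front of an
    # 8 mm objective is imaged about 8.09 mm behind the lens. The microlens
    # array can then be placed in this image plane or, in the other
    # arrangements described above, between the image plane and the sensor.
    print(round(thin_lens_image_distance(8.0, 700.0), 2))
```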
In a further embodiment, the control apparatus triggers the airbag module if the object is located in the first partial region, and the airbag module is not triggered by the control apparatus if the object is located in the second partial region. This therefore takes account of the fact that the airbag module must not damage or injure any occupant or object located in the second partial region.
A further embodiment is characterized in that the evaluation device is configured such that it detects whether a child seat with a child sitting in it is arranged in the prespecifiable region. In this case, the control apparatus does not trigger, or only partially triggers, the airbag module in dependence on the size of the child. According to the invention, this takes account of the fact that children are particularly at risk from a triggered airbag module.
In a further embodiment, the trigger region of the airbag module comprises a first, a second and a third partial region, and the airbag module is triggered if the object is located in the first partial region, the airbag module is triggered partially if the object is located in the second partial region, and the airbag module is not triggered if the object is located in the third partial region. If parts of the object are located in the first and/or second partial region, and parts of the object are located in the third partial region, the invention makes provision for the control apparatus not to trigger the airbag module. This case may occur, for example, if the legs are located near the airbag module, in particular in an exit region of the airbag module, that is to say in the third partial region, while torso and head are located in the first and/or second partial region. This is because particularly grave injuries may be caused by what is known as the “jack-knife effect.”
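The deployment rule described in this embodiment, including the suppression of deployment when any part of the object lies in the third partial region, could be expressed as in the following sketch; the integer region labels and the dictionary-based interface are illustrative assumptions.

```python
from enum import Enum


class Deployment(Enum):
    FULL = "full"
    PARTIAL = "partial"
    SUPPRESSED = "suppressed"


def decide_deployment(body_part_regions: dict) -> Deployment:
    """Map the partial regions occupied by detected body parts (e.g. head,
    torso, legs) to a deployment decision: any body part in the third
    partial region (label 3) suppresses deployment; otherwise the region
    closest to the airbag governs the decision."""
    regions = set(body_part_regions.values())
    if 3 in regions:
        return Deployment.SUPPRESSED   # e.g. legs in the exit region
    if 2 in regions:
        return Deployment.PARTIAL
    if 1 in regions:
        return Deployment.FULL
    return Deployment.SUPPRESSED       # nothing detected in the trigger region


# The jack-knife example from above: torso and head far away, legs at the exit.
print(decide_deployment({"head": 1, "torso": 2, "legs": 3}))  # Deployment.SUPPRESSED
```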
In a further embodiment, the object is an occupant, in particular the torso and/or the head and/or the legs of the occupant. In a further preferred embodiment, the evaluation device comprises a face detection module or other body detection modules. The evaluation device is thus able to detect from the generated image data whether an occupant is located in the prespecifiable region and, furthermore, to derive from the image data in which partial regions of the trigger region the occupant or specific body parts of the occupant are located.
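The patent does not specify any particular detection algorithm. As a stand-in for the face detection module, the following sketch uses OpenCV's Haar-cascade face detector; the resulting bounding boxes could then be combined with the depth data to assign the head to one of the partial regions.

```python
import cv2  # OpenCV, used here only as a stand-in for the face detection module

# Haar cascade shipped with OpenCV; any other face or body detector could be used.
_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")


def detect_head_boxes(gray_image) -> list:
    """Return bounding boxes (x, y, w, h) of detected faces in a grayscale
    image (a NumPy array from the camera module); the depth values of the
    pixels inside such a box would then locate the head in a partial region."""
    return list(_cascade.detectMultiScale(gray_image,
                                          scaleFactor=1.1,
                                          minNeighbors=5))
```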
In a further embodiment, the sensor is configured as an infrared sensor, in particular as what is known as an LWIR sensor operating in a wavelength range of 8 to 15 μm. Such an embodiment is particularly advantageous since external light sources for illuminating the prespecifiable region can be omitted.
In a further embodiment, the evaluation device calculates, using sensor data, whether the object will, at a later time when the airbag module has already been triggered, be located in the first, second, or third partial region. In this embodiment, the airbag module is triggered if the object will be located in the first partial region at said later time, the airbag module is partially triggered if the object will be located in the second partial region at said later time, and the airbag module is not triggered if the object will be located in the third partial region at said later time. The sensors may preferably be acceleration sensors.
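One way to picture this forward-looking calculation is the simple kinematic sketch below: the occupant-to-airbag distance is projected ahead using a measured closing velocity and the deceleration reported by the acceleration sensors, and the predicted distance can then be classified with a region model such as the one sketched earlier. The kinematic model and all numerical values are assumptions for illustration, not taken from the patent.

```python
def predict_distance_to_airbag(distance_now_m: float,
                               closing_velocity_mps: float,
                               cabin_deceleration_mps2: float,
                               lookahead_s: float) -> float:
    """Project the occupant-to-airbag distance forward by lookahead_s seconds,
    assuming (illustratively) that the occupant initially closes on the airbag
    at closing_velocity_mps and that the closing speed grows with the measured
    cabin deceleration because the occupant is not decelerated with the cabin."""
    travelled = (closing_velocity_mps * lookahead_s
                 + 0.5 * cabin_deceleration_mps2 * lookahead_s ** 2)
    return max(distance_now_m - travelled, 0.0)


if __name__ == "__main__":
    # 0.60 m away now, closing at 1 m/s, cabin decelerating at 150 m/s^2,
    # looking 40 ms ahead: roughly 0.44 m remain, i.e. in the illustrative
    # region model above the head would move from the first toward the
    # second partial region, so the airbag would only be partially triggered.
    print(round(predict_distance_to_airbag(0.60, 1.0, 150.0, 0.040), 2))
```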
Further scope of applicability of the present invention will become apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The present invention will become more fully understood from the detailed description given hereinbelow and the accompanying drawings which are given by way of illustration only, and thus, are not limitive of the present invention, and wherein:
In an exemplary embodiment, a plenoptic camera is configured as a camera module 5, the function of which will be explained in more detail below with reference to the figures. The camera module 5 monitors a prespecifiable region in the front region of a vehicle cabin.
In this front region, two vehicle seats 6 are arranged, as shown in the side view of the figure.
The plenoptic camera, which is configured as the camera module 5, is connected to an evaluation device 11, which processes the image data generated by the camera module 5 and evaluates in which of the three partial regions 8a, 8b or 8c the occupant, or preferably the head or the torso of the occupant, is located. Possible positions of the head 7b of the occupant in specific driving situations in the three partial regions are correspondingly marked by dashed lines. Based on the evaluation by the evaluation device 11, a control apparatus 3, which is configured for example as a microcomputer, triggers the airbag module 2 if the torso or the head is located in the first partial region 8c. The airbag module is triggered only partially if the torso or the head is located in the second partial region 8b, whereas the airbag module is not triggered if the torso or the head is located in the third partial region 8a. A person skilled in the art will appreciate that the conditions under which the airbag module is or is not triggered (head and/or torso or other body parts) can be specified to the control apparatus. To this end, the evaluation device can additionally comprise a face detection module so as to be able to derive unambiguously from the generated image data in which of the partial regions the head is located. Other body detection modules are also conceivable.
The apparatus for controlling the airbag module furthermore comprises a crash sensor unit 4, which determines whether the motor vehicle has undergone an impact at all. Said crash sensor unit 4 is connected to the control apparatus 3 such that they communicate electrically with one another. The crash sensor unit 4 can comprise, for example, one or more acceleration sensors, which can be integrated in the control apparatus or distributed within the motor vehicle. The crash sensor unit can, for example, comprise two acceleration sensors. The airbag module is then triggered only if both sensors, independently of one another, indicate a corresponding deceleration; in other words, regardless of the partial region in which the object is located, an impact of the motor vehicle determined by the crash sensor unit is an essential precondition for triggering the airbag module.
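The two-sensor plausibility check and the gating of the deployment decision could look like the following sketch; the threshold value and the string-based decision interface are illustrative assumptions.

```python
def crash_detected(decel_a_mps2: float, decel_b_mps2: float,
                   threshold_mps2: float = 40.0) -> bool:
    """Signal an impact only when both independent acceleration sensors report
    a deceleration above the threshold (the 40 m/s^2 default is an illustrative
    assumption, not a value from the patent)."""
    return decel_a_mps2 >= threshold_mps2 and decel_b_mps2 >= threshold_mps2


def control_airbag(deployment_decision: str, impact: bool) -> str:
    """Gate the evaluation device's decision ("full", "partial" or "suppressed")
    with the crash sensor unit: without a detected impact, nothing is triggered."""
    return deployment_decision if impact else "no action"


print(control_airbag("partial", crash_detected(55.0, 61.0)))  # partial
```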
The mode of function of the plenoptic camera, which is configured as the camera module, will now be explained in more detail with reference to the figures.
Since each individual pixel under a microlens records the scene under a slightly different parallax, individual sub-images can be generated by decomposing and rejoining corresponding pixels, from which ultimately the depth information according to the invention can be calculated by the evaluation device. The microlens array can, as described above, be located in the image plane or between the image plane and the sensor.
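One simplified way to obtain depth from such sub-images is to treat two of them as a stereo pair and triangulate from their disparity, as in the sketch below; the effective baseline, focal length, pixel pitch and measured disparity are assumed example values, and a real light-field pipeline would combine many sub-images rather than just two.

```python
def depth_from_disparity(baseline_mm: float,
                         focal_length_mm: float,
                         disparity_px: float,
                         pixel_pitch_mm: float) -> float:
    """Triangulate the depth of a scene point from the disparity between two
    sub-aperture views of the light field: depth = baseline * focal_length / disparity."""
    disparity_mm = disparity_px * pixel_pitch_mm
    if disparity_mm <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return baseline_mm * focal_length_mm / disparity_mm


if __name__ == "__main__":
    # Assumed values: 2 mm effective baseline, 8 mm focal length,
    # 0.01 mm pixel pitch, 2.3 px measured disparity -> roughly 696 mm depth.
    print(round(depth_from_disparity(2.0, 8.0, 2.3, 0.01)))
```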
It is thus possible, using the plenoptic camera, to detect predefined objects 7 in a depth-resolved manner in the prespecifiable region, to determine in which partial region of the trigger region of the airbag module, which is located within the prespecifiable region, said objects are situated, and to decide on the basis of this evaluation whether or not the airbag module is to be triggered by the control apparatus 3.
Overall, the apparatus according to the invention provides a compact device for controlling an airbag module that can dispense with a plurality of individual components because a plenoptic camera is used.
The invention being thus described, it will be obvious that the same may be varied in many ways. Such variations are not to be regarded as a departure from the spirit and scope of the invention, and all such modifications as would be obvious to one skilled in the art are to be included within the scope of the following claims.
References Cited

Patent | Priority | Assignee | Title
6,164,693 | Jun 10 1997 | Robert Bosch GmbH | Device for detecting the type of occupancy of the passenger's seat of a motor vehicle
7,227,626 | May 13 2003 | Continental Automotive GmbH | Method for determining the current position of the heads of vehicle occupants
7,403,635 | Sep 11 2002 | Siemens Aktiengesellschaft | Device and method for detection of an object or a person in the interior of a motor vehicle
7,607,509 | Apr 19 2002 | IEE International Electronics & Engineering S.A. | Safety device for a vehicle

U.S. Patent Application Publications: US 2002/0195806, US 2003/0040859, US 2006/0023918, US 2006/0120565, US 2007/0280505, US 2010/0141802, US 2013/0044254

Foreign Patent Documents: DE 10 2011 053 999, DE 10 2011 114 325, DE 10 2012 016 160, DE 102 41 993, DE 103 08 405, DE 103 21 506, DE 197 24 344 C1, DE 44 10 402 A1, WO 02/40320, WO 03/089277
Assignee: JENOPTIK Optical Systems GmbH (application filed Mar 25 2014)
Assignment: Marc Himel to JENOPTIK Optical Systems GmbH, assignment of assignor's interest, executed Apr 22 2014, Reel/Frame 032787/0528