A vehicle has a restraint system with at least two restraint devices. A sensor detects a potential collision object and outputs a signal representing a relative position of the potential collision object to the vehicle. A processing device suppresses deployment of at least one of the restraint devices based at least in part on the signal output by the at least one sensor.

Patent: 9266487
Priority: Mar 17 2014
Filed: Mar 17 2014
Issued: Feb 23 2016
Expiry: Mar 17 2034
Assignee Entity: Large
Status: currently ok
12. A vehicle system comprising:
at least one sensor configured to detect a potential collision object external to a host vehicle and output a signal representing a relative position of the potential collision object; and
a processing device configured to determine an initial seat configuration and suppress deployment of at least one of a plurality of restraint devices based at least in part on the signal output by the at least one sensor and the initial seat configuration, and
wherein the processing device is programmed to suppress deployment of at least another of the plurality of restraint devices based at least in part on the signal output by the at least one sensor and in response to detecting a new seat configuration during autonomous operation of the host vehicle,
wherein detecting the initial seat configuration includes detecting at least one seat facing a first direction and wherein detecting the new seat configuration includes detecting the at least one seat facing a second direction.
1. A vehicle system comprising:
a restraint system having at least two restraint devices;
at least one collision sensor programmed to detect a potential collision object external to a host vehicle and output a signal representing a relative position of the potential collision object;
at least one seat sensor programmed to detect an initial seat configuration and output a signal representing the initial seat configuration and wherein the at least one seat sensor is programmed to detect a new seat configuration and output a signal representing the new seat configuration prior to the at least one collision sensor detecting the potential collision object; and
a processing device programmed to suppress deployment of at least one of the restraint devices based at least in part on the signals output by the at least one collision sensor and the at least one seat sensor,
wherein detecting the initial seat configuration includes detecting at least one seat facing a first direction while the host vehicle is operating in an autonomous mode and wherein detecting the new seat configuration includes detecting the at least one seat facing a second direction while the host vehicle is operating in the autonomous mode.
2. The vehicle system of claim 1, wherein the signal output by the at least one collision sensor indicates a principal direction of force that would be applied to the host vehicle if the potential collision object were to collide with the host vehicle.
3. The vehicle system of claim 1, wherein the at least one seat is configured to move from a first position facing the first direction to a second position facing the second direction, and vice versa, during autonomous operation of the host vehicle.
4. The vehicle system of claim 3, wherein the processing device is further configured to suppress deployment of at least one of the restraint devices based on whether the seat is in the first position or the second position.
5. The vehicle system of claim 3, wherein the first position includes a front-facing position and wherein the second position includes a rear-facing position.
6. The vehicle system of claim 3, wherein the at least two restraint devices includes a first restraint device and a second restraint device, and wherein the processing device is configured to suppress deployment of the first restraint device if the seat is in the second position.
7. The vehicle system of claim 1, wherein the at least two restraint devices includes at least two airbags.
8. The vehicle system of claim 1, wherein the relative position of the potential collision object includes a distance of the potential collision object relative to the host vehicle.
9. The vehicle system of claim 1, wherein the relative position of the potential collision object includes a direction of the potential collision object relative to the host vehicle.
10. The vehicle system of claim 1, wherein the collision sensor includes a communication module configured to receive communication signals from the potential collision object.
11. The vehicle system of claim 10, wherein the processing device is configured to determine the relative position of the potential collision object based at least in part on the communication signal.

Vehicles include restraint systems to reduce injuries to vehicle occupants during a collision. Restraint systems are not always deployed, however. For example, certain restraint systems are disabled unless a passenger is detected in a seat and the passenger is over a certain size or weight. Existing restraint systems, however, assume that all seats in the vehicle, and in particular the front seats, are always forward facing. Therefore, existing restraint systems may not be adequate for autonomous vehicles or other types of vehicles that could have dynamic seat configurations.

FIGS. 1A and 1B illustrate an exemplary vehicle configured to adaptively suppress a restraint system.

FIG. 2 is a block diagram of an exemplary system that may be incorporated into the vehicle of FIGS. 1A and 1B.

FIG. 3 is a flowchart of an exemplary process that may be used to adaptively suppress the restraint system.

An exemplary vehicle has a restraint system with at least two restraint devices. A sensor detects a potential collision object and outputs a signal representing a relative position of the potential collision object to the vehicle. A processing device suppresses deployment of at least one of the restraint devices based at least in part on the signal output by the at least one sensor. Moreover, certain restraint devices can be selectively disabled based on the seat configuration. Accordingly, only those restraint devices (e.g., airbags) that will be able to assist during a vehicle collision may be enabled.

The system shown in the FIGS. may take many different forms and include multiple and/or alternate components and facilities. While an exemplary system is shown, the exemplary components illustrated are not intended to be limiting. Indeed, additional or alternative components and/or implementations may be used.

As illustrated in FIGS. 1A and 1B, the vehicle 100 includes reconfigurable seats 105, including front seats 105A and a rear seat 105B, and multiple restraint devices 110, including a pair of first restraint devices 110A and a second restraint device 110B. The seats 105 may be located in a passenger compartment 115, and at least some of the seats 105 may be configured to move from a front-facing position to a rear-facing position. FIG. 1A shows all seats 105 in the front-facing position and FIG. 1B shows the front seats 105A in the rear-facing position. The restraint devices 110 may include any device configured to cushion an occupant during a collision. Therefore, examples of restraint devices 110 may include airbags, seat belts, or the like. The vehicle 100 may further include a system 120 for detecting a potential collision object and actively suppressing at least one of the restraint devices 110 based on a configuration of the seats 105 and the potential direction of impact or force that would be applied to the vehicle 100 if the collision object were to collide with the vehicle 100. Although illustrated as a sedan, the vehicle 100 may include any passenger or commercial vehicle such as a car, a truck, a sport utility vehicle, a taxi, a bus, etc. In some implementations, the vehicle 100 is an autonomous (e.g., self-driving) or partially autonomous vehicle.

FIG. 2 is a block diagram of an exemplary system 120 for detecting a potential collision object and actively suppressing one or more restraint devices 110. The system 120, as shown in FIG. 2, includes a user interface device 125, at least one sensor 130, and a processing device 135.

The user interface device 125 may be configured to present information to and/or receive inputs from a user, such as a driver, during operation of the vehicle 100. Thus, the user interface device 125 may be located in the passenger compartment 115 of the vehicle 100. In some possible approaches, the user interface device 125 may include a touch-sensitive display screen.

The sensors 130 may include any number of devices configured to generate signals that help during operation of the vehicle 100. For instance, in the case of an autonomous vehicle 100, that is, a vehicle 100 capable of operating in an autonomous (e.g., driverless) or partially autonomous mode, the sensors 130 may include a radar sensor, a camera, or the like. The sensors 130 may help the vehicle 100 “see” the roadway and/or negotiate various obstacles while the vehicle 100 is operating in the autonomous mode. Even when operating in a non-autonomous mode, the sensors 130 may be configured to help identify the presence and location of potential collision objects. Examples of collision objects may include other vehicles, infrastructure devices such as bridges, walls, and other inanimate objects, traffic control devices such as signs, debris in the roadway, etc. Upon detecting a potential collision object, the sensor 130 may be configured to output a signal representing a relative position of the potential collision object. The relative position may be, e.g., the position of the potential collision object relative to the vehicle 100, which may include a distance between the potential collision object and the vehicle 100 as well as a direction or angle of the potential collision object relative to the vehicle 100. With the relative position, the processing device 135 can determine a possible or likely direction of impact or force if the potential collision object were to collide with the vehicle 100.
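
The relative position described above (a distance plus a direction or angle relative to the host vehicle) can be sketched with simple planar geometry. The function below is a minimal illustration, not from the patent; the coordinate frame and parameter names are assumptions.

```python
import math

def relative_position(host_xy, object_xy, host_heading_rad):
    """Return (distance, bearing) of a potential collision object relative
    to the host vehicle. Positions are planar (x, y) in meters; bearing is
    measured from the host's heading in radians, 0 meaning dead ahead.
    Frame conventions are illustrative assumptions, not from the patent."""
    dx = object_xy[0] - host_xy[0]
    dy = object_xy[1] - host_xy[1]
    distance = math.hypot(dx, dy)
    # Angle of the object in the world frame, expressed relative to the
    # host heading; the bearing approximates a likely direction of impact.
    bearing = math.atan2(dy, dx) - host_heading_rad
    # Normalize to (-pi, pi].
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    return distance, bearing
```

A bearing near zero would correspond to a likely frontal impact, which is the quantity the processing device 135 needs when deciding which restraint devices 110 to suppress.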

Another type of sensor 130 included in the vehicle 100 may include a communication module 140. The communication module 140 may be configured to transmit messages to and/or receive messages from other vehicles 100. The communication module 140 may be configured to communicate in accordance with any number of communication protocols. For example, the communication module 140 may be configured to communicate according to one or more of a Dedicated Short-Range Communication (DSRC) protocol, the Bluetooth® protocol, WiFi, or the like. The communication module 140 may be configured to receive communication signals from, e.g., the potential collision object. Examples of communication signals may include a signal that indicates a position of the potential collision object. The processing device 135 may determine the relative position of the potential collision object from the position sent via the communication signal. Although shown as one of the sensors 130, the communication module 140 may alternatively be separate from the sensors 130.
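
When the position arrives via the communication module 140 rather than a ranging sensor, the processing device 135 can still derive a relative position from the reported coordinates. The sketch below assumes a hypothetical JSON payload with invented field names; real DSRC/V2V messages are binary and considerably richer.

```python
import json
import math

def position_from_v2v(message_json, host_xy):
    """Derive (distance, world-frame angle) of a potential collision object
    from a hypothetical V2V message carrying the sender's planar position.
    The "x"/"y" field names are illustrative assumptions only."""
    msg = json.loads(message_json)
    dx = msg["x"] - host_xy[0]
    dy = msg["y"] - host_xy[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)
```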

In the context of autonomous vehicles 100, the processing device 135 may be configured to control one or more subsystems 145 while the vehicle 100 is operating in the autonomous mode. Examples of subsystems 145 that may be controlled by the processing device 135 may include a brake subsystem, a suspension subsystem, a steering subsystem, and a powertrain subsystem. The processing device 135 may control any one or more of these subsystems 145 by outputting signals to control units associated with these subsystems 145. The processing device 135 may control the subsystems 145 based, at least in part, on signals generated by the sensors 130. Moreover, the processing device 135 may be configured to determine the orientation of the seats 105 in the vehicle 100. For instance, the processing device 135 may be configured to determine which, if any, of the seats 105 are in a front-facing position and which, if any, of the seats 105 are in a rear-facing position based on signals received from seat sensors 150.

Whether operating in the autonomous mode or the non-autonomous mode, the processing device 135 may be configured to control the operation of certain restraint devices 110. For instance, based on signals received from the sensor 130, which may include the communication module 140, the processing device 135 may determine a possible or likely principal direction of impact or force that would be applied to the vehicle 100 if, e.g., the potential collision object were to collide with the vehicle 100. The processing device 135 may be configured to suppress deployment of at least one of the restraint devices 110 based on, e.g., the position signal received from the sensor 130.

Alternatively or in addition, the processing device 135 may be configured to suppress deployment of at least one of the restraint devices 110 based on, e.g., the configuration of the seats 105. Thus, some restraint devices 110, such as the first restraint devices 110A, may only be deployed following a collision with the seats 105 in the front-facing position, while other restraint devices 110, such as the second restraint device 110B, may only be deployed following a collision with the seats 105 in the rear-facing position.
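
The seat-based policy above can be captured as a small lookup. This is an illustrative sketch only; the device names echo the reference numerals in FIGS. 1A and 1B, and the boolean interface is an assumption, not the patent's implementation.

```python
# Device labels mirror the reference numerals from FIGS. 1A/1B.
FIRST_RESTRAINTS = "110A"   # deployable only when front seats face forward
SECOND_RESTRAINT = "110B"   # deployable only when front seats face rearward

def suppressed_devices(front_seats_rear_facing):
    """Return the set of restraint devices to suppress for the given
    front-seat orientation (True = rear-facing). Illustrative policy:
    suppress whichever device cannot assist in the current configuration."""
    if front_seats_rear_facing:
        return {FIRST_RESTRAINTS}
    return {SECOND_RESTRAINT}
```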

In general, computing systems and/or devices, such as the user interface device 125 and the processing device 135, may employ any of a number of computer operating systems, including, but by no means limited to, versions and/or varieties of the Ford Sync® operating system, the Microsoft Windows® operating system, the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, Calif.), the AIX UNIX operating system distributed by International Business Machines of Armonk, N.Y., the Linux operating system, the Mac OS X and iOS operating systems distributed by Apple Inc. of Cupertino, Calif., the BlackBerry OS distributed by Research In Motion of Waterloo, Canada, and the Android operating system developed by the Open Handset Alliance. Examples of computing devices include, without limitation, an on-board vehicle 100 computer, a computer workstation, a server, a desktop, notebook, laptop, or handheld computer, or some other computing system and/or device.

Computing devices generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies, including, without limitation, and either alone or in combination, Java™, C, C++, Visual Basic, Java Script, Perl, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media.

A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD, any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.

In some examples, system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, etc.), stored on computer readable media associated therewith (e.g., disks, memories, etc.). A computer program product may comprise such instructions stored on computer readable media for carrying out the functions described herein.

FIG. 3 is a flowchart of an exemplary process 300 that may be implemented by one or more components of the system 120 of FIG. 2. For example, the process 300 may be executed by the processing device 135 while the vehicle 100 is running or until deactivated by a user.

At decision block 305, the processing device 135 may determine whether the system 120 has been activated. The system 120 may be activated by, e.g., default or in response to a user input received via the user interface device 125. Moreover, the system 120 may be deactivated in response to a user input. If the system 120 has been activated, the process 300 may continue at block 310. If the system 120 has been deactivated, the process 300 may end.

At block 310, the processing device 135 may determine the seat 105 configuration. The seat 105 configuration may be determined from signals received from a controller, which may be the processing device 135, configured to control the direction of the seats 105. Alternatively or in addition, the seat 105 configuration may be determined from a seat sensor 150. The seat 105 configuration may indicate that one or more seats 105 are in a front-facing position and one or more seats 105 are in a rear-facing position.

At decision block 315, the processing device 135 may determine whether a potential collision object has been detected. Whether a potential collision object has been detected may be determined from, e.g., one or more signals output by the sensors 130. Thus, determining whether a potential collision object has been detected may include receiving a position signal that indicates the position of one or more potential collision objects. The position signal may indicate the distance and/or direction of the potential collision object relative to the vehicle 100. In some implementations, the position signal may be received via a communication signal transmitted to the communication module 140. If a potential collision object is detected, the process 300 may continue at block 325. If no potential collision objects are detected, the process 300 may continue at decision block 320.

At decision block 320, the processing device 135 may determine whether the seat 105 configuration has changed based on, e.g., signals received from a seat sensor 150. If the seat 105 configuration has changed, the process 300 may continue at block 310. If the seat 105 configuration has not changed, the process 300 may continue at decision block 315.

At block 325, the processing device 135 may determine the position of the potential collision object. The position of the potential collision object may include a distance and/or direction of the potential collision object relative to the vehicle 100. The position of the potential collision object may indicate a principal direction of force that would be applied to the vehicle 100 if the potential collision object were to collide with the vehicle 100.

At decision block 330, the processing device 135 may determine whether to suppress one or more restraint devices 110 based on, e.g., the position signal and the seat 105 configuration. If the processing device 135 determines that one or more restraint devices 110 should be suppressed, the process 300 may continue at block 335. If no restraint devices 110 are to be suppressed, the process 300 may continue at block 320.

At block 335, the processing device 135 may suppress deployment of one or more restraint devices 110 based on the position signal and the seat 105 configuration. Therefore, the processing device 135 may enable the restraint devices 110 most likely to provide the greatest protection if the potential collision object were to collide with the vehicle 100. Suppressing the deployment of one or more restraint devices 110 may include, for example, disabling one or both of the first restraint devices 110A when the front seats 105A are in a rear-facing position while a second restraint device 110B remains enabled. The process 300 may continue at block 320 to determine whether the seat 105 configuration has changed.
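
The flow through blocks 305-335 can be sketched as a loop. The `system` and `decide` interfaces below are hypothetical stand-ins invented for illustration; the patent does not prescribe any particular API.

```python
def run_restraint_suppression(system, decide):
    """Sketch of the FIG. 3 flow. Assumed (hypothetical) interfaces:
    system.is_activated() -> bool                      (decision block 305)
    system.seat_configuration() -> config              (block 310)
    system.detect_object() -> position or None         (blocks 315/325)
    system.seat_config_changed() -> bool               (decision block 320)
    decide(position, config) -> devices to suppress    (decision block 330)
    system.suppress(devices)                           (block 335)
    """
    if not system.is_activated():
        return
    config = system.seat_configuration()
    while system.is_activated():
        position = system.detect_object()
        if position is None:
            # No object detected: re-check the seat configuration (block 320).
            if system.seat_config_changed():
                config = system.seat_configuration()
            continue
        devices = decide(position, config)
        if devices:
            system.suppress(devices)
        # Return to block 320 after handling the detection.
        if system.seat_config_changed():
            config = system.seat_configuration()
```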

With regard to the processes, systems, methods, heuristics, etc. described herein, it should be understood that, although the steps of such processes, etc. have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It further should be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments, and should in no way be construed so as to limit the claims.

Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined, not with reference to the above description, but should instead be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.

All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein unless an explicit indication to the contrary is made herein. In particular, use of the singular articles such as “a,” “the,” “said,” etc. should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.

The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separately claimed subject matter.

Spahn, Brian Robert, Tellis, Levasseur, Miller, Alex Maurice, Zwicky, Timothy D., Engelman, Gerald H., Stephenson, Richard Lee

Patent Priority Assignee Title
10011194, Mar 08 2017 Lear Corporation System and method for positioning a vehicle seat
10625699, Mar 31 2018 Arriver Software LLC Enhanced occupant seating inputs to occupant protection control system for the future car
10726310, Mar 31 2018 Arriver Software LLC Deployment zone definition and associated restraint control
11279307, Aug 26 2019 Lear Corporation Seat arrangement with dynamic seat positioning system
11377057, Oct 15 2019 Autoliv ASP, Inc. Seat location identification system for vehicles with removable seats
Patent Priority Assignee Title
5893582, Feb 21 1997 Key Safety Systems, Inc Airbag deployment control system
6236035, Nov 25 1998 TRW Inc Apparatus and method for sensing with triangulation and time-of-flight for use in an actuatable occupant restraint system
6302438, Apr 20 1998 Joyson Safety Systems Acquisition LLC Occupant detection system
6323487, Jan 26 1999 Aptiv Technologies Limited IR optical position sensor system
7976060, Jun 07 1995 AMERICAN VEHICULAR SCIENCES LLC Seat load or displacement measuring system for occupant restraint system control
8041469, Jan 05 2005 GM Global Technology Operations LLC Determining relative spatial information between vehicles
20020166710
20070267848
DE102010054743
FR2943961
GB2425720
Executed on | Assignor | Assignee | Conveyance | Frame/Reel/Doc
Mar 11 2014 | STEPHENSON, RICHARD LEE | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 11 2014 | ZWICKY, TIMOTHY D. | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 13 2014 | MILLER, ALEX MAURICE | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 13 2014 | TELLIS, LEVASSEUR | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 14 2014 | ENGELMAN, GERALD H. | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 14 2014 | SPAHN, BRIAN ROBERT | Ford Global Technologies, LLC | ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS) | 0324520697 (pdf)
Mar 17 2014 | Ford Global Technologies, LLC | (assignment on the face of the patent)
Date Maintenance Fee Events
Jul 16 2019M1551: Payment of Maintenance Fee, 4th Year, Large Entity.
Jul 13 2023M1552: Payment of Maintenance Fee, 8th Year, Large Entity.


Date Maintenance Schedule
Feb 23 2019 | 4 years fee payment window open
Aug 23 2019 | 6 months grace period start (w surcharge)
Feb 23 2020 | patent expiry (for year 4)
Feb 23 2022 | 2 years to revive unintentionally abandoned end. (for year 4)
Feb 23 2023 | 8 years fee payment window open
Aug 23 2023 | 6 months grace period start (w surcharge)
Feb 23 2024 | patent expiry (for year 8)
Feb 23 2026 | 2 years to revive unintentionally abandoned end. (for year 8)
Feb 23 2027 | 12 years fee payment window open
Aug 23 2027 | 6 months grace period start (w surcharge)
Feb 23 2028 | patent expiry (for year 12)
Feb 23 2030 | 2 years to revive unintentionally abandoned end. (for year 12)