The human machine interface (HMI) of a vehicle, especially an automobile, is responsive to outside stimuli. Signals relevant to the amount of concentration required of the driver are received by the HMI, which determines, based on the values of these signals, a driver distraction level. As the driver distraction level increases, areas of a structured display on a display screen are dynamically removed, presenting the driver with reduced distractions when greater concentration is required.
21. A human machine interface for a vehicle, the human machine interface comprising at least one display screen configured to simultaneously display a plurality of areas as a structured display; the human machine interface operable to determine a driver distraction level based on at least one signal received by the human machine interface and to remove predetermined areas from the plurality of areas of the structured display on the screen in response to an increase in the driver distraction level beyond a predetermined threshold; wherein the human machine interface is operable to remove areas from the structured display dynamically, such that predetermined areas gradually recede from the screen, appear to move off the edge of the plurality of areas of the structured display, or become covered by other areas of the screen.
1. An automobile comprising a human machine interface, the human machine interface comprising at least one display screen configured to simultaneously display a plurality of areas as a structured display; the human machine interface operable to determine a driver distraction level based on at least one signal received by the human machine interface and to remove predetermined areas from the plurality of areas of the structured display on the screen in response to an increase in the driver distraction level beyond a predetermined threshold; wherein the human machine interface is operable to remove areas from the structured display dynamically, such that predetermined areas gradually recede from the screen, appear to move off the edge of the plurality of areas of the structured display, or become covered by other areas of the screen.
2. The automobile comprising a human machine interface according to
3. The automobile comprising a human machine interface according to
4. The automobile comprising a human machine interface according to
5. The automobile comprising a human machine interface according to
6. The automobile comprising a human machine interface according to
7. The automobile comprising a human machine interface according to
8. The automobile comprising a human machine interface according to
9. The automobile comprising a human machine interface according to
10. The automobile comprising a human machine interface according to
11. The automobile comprising a human machine interface according to
12. The automobile comprising a human machine interface according to
13. The automobile comprising a human machine interface according to
14. The automobile comprising a human machine interface according to
15. The automobile comprising a human machine interface according to
16. The automobile comprising a human machine interface according to
17. The automobile comprising a human machine interface according to
18. The automobile comprising a human machine interface according to
19. The automobile comprising a human machine interface according to
20. A method of controlling a human machine interface in the automobile according to
a. receiving at least one signal relating to a factor relevant to the amount of concentration required of a driver;
b. calculating the driver distraction level based on the at least one signal;
c. determining whether the driver distraction level is above a predetermined threshold; and
d. if the driver distraction level is determined to exceed a predetermined threshold, controlling the display screen so as to dynamically remove predetermined areas from the structured display on the display screen.
The present application is a national phase entry under 35 U.S.C. § 371 of International Application No. PCT/GB2016/053740, filed Nov. 29, 2016, entitled “RESPONSIVE HUMAN MACHINE INTERFACE,” which designated, among the various States, the United States of America, and which claims priority to GB 1521360.6 filed Dec. 3, 2015, both of which are hereby incorporated by reference.
The present invention relates to the human machine interface (HMI) of a vehicle, especially, but not exclusively, an automobile. In particular the invention relates to a HMI which is responsive to outside stimuli.
It is now common for automobiles, and other vehicles, to have display screens through which a user interacts with the vehicle. Such display screens are navigable by way of various inputs, such as touch-screens, buttons, knobs, gestures or voice. Depending on the mode chosen, such screens may display different information, for example in relation to satellite navigation, video displays from cameras located about the vehicle, cell-phone use, movies, radio stations, climate control and so on. Without limitation, speed, RPM and other information may also be displayed on such screens. Usually such display screens are integral to the vehicle, but a display screen may be part of a portable mobile device, such as a phone or tablet, which is suitably connected to the vehicle such that inputs to the portable device control the vehicle and signals from the vehicle are transmitted to the portable device.
Some of these outputs and icons for potential inputs can create a distraction to a driver and naturally, when inputting information into the HMI, a driver's attention to the road is reduced. Accordingly, certain vehicles disable certain inputs in response to the state of the vehicle. In particular, it is common for satellite navigation systems to disable the option of entering a new destination when the vehicle is in motion (e.g. above 7 kph). This is normally done by “greying out” the input icon on the screen, to a slightly lighter colour than the other options, or when the icon is selected, failing to respond, or displaying a message that the icon cannot be selected when the vehicle is in motion.
Similarly, when an automobile is in motion, the movie/TV option is often disabled in the same fashion, i.e. it is “greyed out”, or when selected fails to respond or displays a message.
An object of embodiments of the invention is to provide an improved HMI.
According to a first aspect of the present invention, there is provided a human machine interface (HMI) for a vehicle, the HMI comprising at least one display screen configured to simultaneously display a plurality of areas as a structured display; the human machine interface operable to determine a driver distraction level based on at least one signal received by the HMI and to remove predetermined areas from the structured display on the screen in response to an increase in the driver distraction level beyond a predetermined threshold.
Such a HMI is advantageous in that a user (whose distraction level is already determined to be beyond a certain level) will not be confused by the appearance of an area, e.g. a greyed out selectable area which cannot be selected, nor presented with a message which could further distract him/her as the amount of concentration required increases.
The areas may be removed dynamically from the structured display, for example such that they appear to move off the edge of the visible area, or become covered by other areas of the screen. This dynamic motion, which may be a smooth movement, has the effect of acclimatising the driver to the loss of information/options so that he/she will not attempt to find the option/information in the area once it is no longer visible. This further avoids potential distraction, when concentration is required.
As areas are removed from the structured display, the size of the remaining areas may be increased. The increase in size of remaining areas makes the screen easier to read and use for a driver whose distraction level is beyond a predetermined threshold.
One or more of the plurality of areas may be selectable areas. Selectable areas may be removed from the screen.
At least one of the plurality of areas may comprise an image of a navigation instruction. The size of the area comprising the navigation instruction may increase in response to an increase in the driver distraction level.
The driver distraction level may correspond to the speed of travel of the vehicle only.
The driver distraction level may be calculated based on other factors instead of, or in addition to, the speed of the vehicle.
The driver distraction level may be calculated based on signals indicative of one or more of the following factors: speed, driver drowsiness, road condition, traffic conditions, such as traffic jams, navigation data, such as road curvature, upcoming signals, intersections, stop signs, upcoming manoeuvres, status of Autonomous Cruise Control (ACC), status of Automatic Emergency Braking system (AEB), status of automatic Lane Keeping System (LKS) and/or Lane Departure Warning system (LDW), telephone status (e.g. on/off/active call), radio/video status and/or volume.
Additional factors, such as the state of driving controls, for example, a gear lever or hand brake, may be used, and each factor may be weighted. For example, with the gear lever in neutral and the hand brake activated, the weighting may be such that the driver distraction level is set to zero regardless of factors such as driver drowsiness, road condition, traffic conditions, navigation data, upcoming maneuvers or the status of the various safety systems (on the basis that with the gears not engaged and the handbrake on, the car is not being driven).
Speed can be determined from signals provided to the speedometer; driver drowsiness can be determined by known means, such as disclosed in EP0999520; road condition, e.g. wetness, can be determined from sensors, such as are used in automatic windscreen wiper systems, or based on determining the windscreen wiper speed; traffic conditions, navigation data and upcoming manoeuvres can all be determined from data obtained by a satellite navigation system. The status of the various safety systems can be determined from the vehicle's central computer, or Engine Control Unit (ECU).
Driver distraction level can be calculated by assigning a value based on the level of one or more of the factors and applying a weighting to the factor.
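Purely by way of illustration, such a weighted-sum calculation might be sketched as follows; the factor names, value ranges and weights, and the neutral-gear/handbrake override, are assumptions for the example rather than values taken from the claims.

```python
# Illustrative sketch only: a weighted sum of factor values, with the
# neutral-gear/handbrake override described above. Factor names, value
# ranges and weights are assumptions for the example.

def distraction_level(factors: dict, weights: dict,
                      gear_neutral: bool = False,
                      handbrake_on: bool = False) -> float:
    """Return the weighted sum of factor values, or zero when the car is parked."""
    if gear_neutral and handbrake_on:
        return 0.0  # gears disengaged and handbrake on: the car is not being driven
    return sum(weights.get(name, 1.0) * value for name, value in factors.items())

# Example usage with assumed values and weights.
level = distraction_level(
    factors={"speed": 7, "drowsiness": 3, "road_condition": 2},
    weights={"speed": 2.0, "drowsiness": 1.5, "road_condition": 1.0},
)
print(level)  # 20.5 for these illustrative inputs
```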
In one particular embodiment of the invention, a human machine interface for a vehicle comprises a display screen configured to simultaneously display a plurality of areas and is operable to determine a driver distraction level based on at least one signal received by the human machine interface and to dynamically remove areas from the screen in response to an increase in the driver distraction level. The signal indicative of driver distraction level is indicative of the speed of the vehicle and the distance until the next navigation manoeuvre. The display screen is configured to simultaneously display an area comprising an image of navigation instructions, an area comprising an image of a map and one or more selectable areas. The human machine interface is operable such that, as speed increases and as the distance until the next manoeuvre decreases past a first predetermined threshold, the display screen is operable to remove one or more selectable areas from the structured display on the screen; then, as a second predetermined threshold is passed, the human machine interface is operable to increase the size of the area comprising navigation instructions; then, as a third predetermined threshold is passed, the human machine interface is operable to remove the area comprising the image of the map from the screen such that only the image comprising a navigation instruction remains and its size is significantly increased.
The image of the map may be removed completely only above a certain predetermined threshold speed, e.g. at least 200 kph, or even at least 300 kph, and within a certain predetermined distance of the next navigation manoeuvre, e.g. less than 1 km.
This is because, generally speaking, drivers will have sufficient time to note navigation instructions, and the addition of a map will assist in taking the correct turning. In high-speed driving, however, such as may be conducted in high-powered automobiles on roads without speed restriction, it is critical to present as few distractions as possible to the driver as a navigation manoeuvre, e.g. taking an exit from the motorway, approaches.
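As a non-limiting sketch, the staged behaviour of this embodiment might be expressed as below. The 200 kph / 1 km condition for removing the map follows the example above; the combined speed/distance score and the first two cut-off values are assumptions introduced purely for illustration.

```python
# Illustrative sketch of the staged display reduction described above.
# The "urgency" score and the first two thresholds are assumptions; the
# 200 kph / 1 km map-removal condition follows the example in the text.

def display_state(speed_kph: float, distance_km: float) -> dict:
    urgency = speed_kph / max(distance_km, 0.1)   # assumed combined score
    state = {
        "selectable_areas": True,
        "map": True,
        "navigation_instruction_enlarged": False,
    }
    if urgency > 50:        # first threshold: remove selectable areas
        state["selectable_areas"] = False
    if urgency > 150:       # second threshold: enlarge the instruction area
        state["navigation_instruction_enlarged"] = True
    if speed_kph >= 200 and distance_km < 1.0:
        state["map"] = False  # third threshold: only the instruction remains
    return state

print(display_state(80, 5.0))    # everything shown at moderate speed, far from the turn
print(display_state(250, 0.5))   # selectable areas and map removed; instruction enlarged
```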
The dynamic removal of images from the screen may be such that the image appears to move off the visible area of the screen.
In one particular embodiment, as speed gradually increases, so predetermined areas may gradually recede from the screen, for example, either fading away, or moving off the visible area of the screen.
Predetermined areas may be selectable areas and may gradually recede from the screen until at a certain predetermined speed they are no longer selectable.
Once certain first predetermined selectable areas have receded and are no longer selectable, second predetermined areas may recede (e.g. fading or moving off the visible area of the screen), until the speed reaches a second predetermined speed, at which the second predetermined areas are no longer selectable either.
Once the second predetermined selectable areas are no longer selectable, one or more third predetermined areas may recede, until at a certain predetermined speed only a predetermined minimum level of information is provided on the screen.
The extent to which the areas recede may be directly proportional to the speed of the vehicle, receding gently as speed increases gently, or faster if speed increases faster.
The areas may also return to the screen in the same fashion as speed reduces, gradually appearing, and becoming selectable at the same predetermined threshold speed at which they stopped being selectable.
In a particularly preferred embodiment, the selectable areas gradually move off the visible area of the screen and begin to fade at the moment that they are no longer selectable. Alternatively, the selectable areas may gradually fade and begin to move off the screen at the point that they are no longer selectable.
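A minimal sketch of such speed-proportional recession is given below; the 100 kph cut-off and the linear curve are assumptions chosen only to illustrate the behaviour described (fully present at low speed, receding smoothly as speed rises, no longer selectable beyond the threshold). Because the state depends only on the current speed, the areas return in the same fashion as speed falls.

```python
# Illustrative sketch only: an area that recedes in proportion to speed.
# The 100 kph threshold and the linear curve are assumptions.

def recede_state(speed_kph: float, threshold_kph: float = 100.0) -> dict:
    progress = min(max(speed_kph / threshold_kph, 0.0), 1.0)  # 0 = fully shown, 1 = gone
    return {
        "opacity": 1.0 - progress,     # fades as speed increases
        "offset_fraction": progress,   # fraction moved off the visible area
        "selectable": speed_kph < threshold_kph,
    }

print(recede_state(0))    # fully visible and selectable
print(recede_state(50))   # half faded, half off-screen, still selectable
print(recede_state(120))  # fully receded and no longer selectable
```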
According to a second aspect of the invention there is provided a method of controlling a human machine interface (HMI) in an automobile; the method comprising: receiving at least one signal relating to a factor relevant to the amount of concentration required of a driver; calculating a driver distraction level based on the at least one signal; determining whether the driver distraction level is above a predetermined threshold; and if the driver distraction level is determined to exceed a predetermined threshold, controlling the display screen so as to dynamically remove predetermined areas from the structured display on the display screen.
Receiving at least one signal relevant to the concentration required of a driver may comprise receiving a plurality of signals relevant to the amount of concentration required.
The signals relevant to the concentration required may be signals indicative of one or more of the following factors: speed, driver drowsiness, road condition, traffic conditions, navigation data, upcoming maneuvers, status of Autonomous Cruise Control (ACC), status of Automatic Emergency Braking system (AEB), status of automatic Lane Keeping System (LKS), and/or Lane Departure Warning system (LDW), status of a Traffic Pilot, or Auto Pilot, telephone status (e.g. on/off/active call), radio/video status and/or volume.
Calculating the distraction level may comprise applying a weighting to the different signals received, in order to determine a total driver distraction level.
A plurality of thresholds may be provided and as each threshold is exceeded a greater number of predetermined areas may be dynamically removed from the structured display.
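As a sketch only, the method of this second aspect, with a plurality of thresholds, might look like the following; the signal names, threshold values and area groupings are illustrative assumptions.

```python
# Illustrative sketch of the method: receive signals, calculate a distraction
# level, compare it against a plurality of thresholds and remove progressively
# more predetermined areas. All names and values here are assumptions.

THRESHOLDS = [
    # (threshold, areas removed from the structured display once it is exceeded)
    (20,  {"media_controls"}),
    (50,  {"media_controls", "phone_contacts"}),
    (100, {"media_controls", "phone_contacts", "map"}),
]

def areas_to_remove(level: float) -> set:
    removed: set = set()
    for threshold, areas in THRESHOLDS:
        if level > threshold:
            removed = areas          # each exceeded threshold removes more areas
    return removed

def control_step(signals: dict) -> set:
    level = sum(signals.values())    # stands in for the weighted calculation
    return areas_to_remove(level)

print(control_step({"speed": 60, "drowsiness": 10}))  # {'media_controls', 'phone_contacts'}
```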
The method may include any of the features set out above in relation to the first aspect of the invention.
The present invention also extends to an automobile comprising a HMI as set out in the first aspect (including optional features) and an automobile adapted to carry out the method of the second aspect (including optional features).
In order that the invention may be more clearly understood embodiments thereof will now be described, by way of example only, with reference to the accompanying drawings, of which:
Referring to
The processor 5 is also able to receive and process signals from outside the HMI 2, which are received by a communications unit 14. In particular, the processor 5 is arranged to receive, from the communications unit 14 signals from a satellite navigation module 8, a vehicle speed sensor 9, a driver drowsiness estimating apparatus 10, a road condition sensor 11 (e.g. a sensor or combination of sensors arranged to measure wetness and temperature), and from the automobile's ECU 12.
Those skilled in the art will appreciate that the processor 5 need not receive all of these signals directly from the apparatuses 8, 9, 10, 11, 12 defined; for example, the ECU alone may provide signals indicative of the vehicle's speed or road conditions, and an indicative speed could even be included in the signal from the satellite navigation module 8. It will also be appreciated that receiving modules (not shown) may be introduced to interface with the various external apparatuses 8-12 to receive, and optionally format, the signals before they are passed to the communications unit 14.
The processor 5 is arranged to request and subsequently process signals received from the various external apparatuses 8-12 and to send them to the distraction level determination module 7. In response to the distraction level determined by the distraction level determination module 7, and to the input from the input device 4, the processor 5 outputs a display to the display screen 3.
Of course, the processor 5 is also responsible for processing and forwarding commands from the input device 4 to further external hardware 13 such as radio or climate control devices etc., via the communications unit 14 in the HMI 2 (which will also feed back signals to the processor concerning the devices it controls), and optionally via the automobile's ECU 12.
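The wiring just described might be summarised in code as below; the reference numerals follow the description, but the interfaces and the single assumed speed source are placeholders, since the figures and real signal formats are not reproduced here.

```python
# Illustrative sketch of the component wiring described above. Reference
# numerals follow the description; interfaces are placeholder assumptions.

from dataclasses import dataclass, field

@dataclass
class CommunicationsUnit:            # item 14: gathers signals from items 8-12
    sources: dict = field(default_factory=dict)

    def read_signals(self) -> dict:
        return {name: source() for name, source in self.sources.items()}

@dataclass
class DistractionModule:             # item 7: turns signals into a level
    def level(self, signals: dict) -> float:
        return sum(signals.values())  # placeholder for the weighted calculation

@dataclass
class Processor:                     # item 5: orchestrates the HMI
    comms: CommunicationsUnit
    distraction: DistractionModule

    def update_display(self) -> float:
        signals = self.comms.read_signals()
        return self.distraction.level(signals)  # the display logic would act on this

hmi = Processor(CommunicationsUnit({"speed": lambda: 20.0}), DistractionModule())
print(hmi.update_display())          # 20.0 with the single assumed speed source
```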
According to the method of an embodiment of the invention, the processor 5 receives input signals from the external apparatuses 8-12 and the HMI control unit 14, as set out in Table 1 below:
TABLE 1
Source                                Signal(s) provided
Satellite navigation module 8         Traffic conditions; navigation data; upcoming manoeuvres
Speed sensor 9                        Vehicle speed
Drowsiness estimating apparatus 10    Driver drowsiness level
Road condition sensor 11              Road condition
ECU 12                                Autonomous Cruise Control (ACC) status; Automatic Emergency Braking system (AEB) status; Automatic Lane Keeping System (LKS) status; Lane Departure Warning system (LDW) status; Traffic Pilot status; Auto Pilot status
HMI control unit 14                   Telephone status (e.g. on/off/active call); radio status; video status; volume
It will be appreciated that the signals may be provided by other devices in the vehicle, for example, the satellite navigation module may be able to provide information on the status of the radio, and the ECU or instrument cluster may provide speed information.
The signals are processed and sent to the distraction level determination module 7, which determines the distraction level based on the values of these signals.
Of course, various methods may be used to weight the different results, to conditionally take certain values into account and so on, and such methods may be optimised with improved algorithms. As an exemplary specific embodiment, however, the distraction level determination module 7 determines the distraction level by assigning weighted values to the results of the input signals as shown in Table 2 below:
TABLE 2
Factor                                                Assigned values
Traffic conditions                                    Clear = 0; Moderate = 5; Busy = 10
Navigation data                                       Straight road = 0; Moderate curves = 5; Very curvy = 10
Upcoming manoeuvres                                   None within 2 minutes = 0; Easy manoeuvre within 2 minutes = 1; Easy manoeuvre within 30 seconds = 5; Complex manoeuvre within 2 minutes = 5; Complex manoeuvre within 30 seconds = 10
Vehicle speed                                         0 kph = −1000; 0-20 kph = 0; 20-50 kph = 5; 50-90 kph = 10; 90-130 kph = 20; 130-200 kph = 50; 200-300 kph = 90; 300+ kph = 1000
Driver drowsiness level                               Fully awake = 0; Moderately drowsy = 10; Drowsy = 30
Road condition                                        Dry = 0; Wet = 5; Foggy = 30; Potentially icy = 30
Autonomous Cruise Control (ACC) status                On = 0; Off = 5
Automatic Emergency Braking system (AEB) status       On = −5; Off = 5
Automatic Lane Keeping System (LKS) status            On = −10; Off = 5
Lane Departure Warning system (LDW) status            On = −5; Off = 2
Time of day                                           Day = 0; Dawn/dusk = 10; Night = 10
Telephone status (e.g. off/active call/incoming call/interaction with device)   Off = 0; Active call = 10; Incoming call = 20; Interaction = 30
Radio status                                          Inactive = 0; Channel browsing = 30
Video status                                          Off = 0; On = 20
Volume                                                Off = 0; Quiet = 2; Loud = 10
Once a value has been assigned to each factor, the distraction level determination module 7 sums the weighted values to produce a total distraction level, which is output to the processor to determine which images should be displayed on the screen and which to dynamically remove. This distraction level is constantly updated as new signals are received.
The value assigned to the distraction level is then used to determine which images are shown on the display screen 3. Thus the memory 6 stores information concerning which images to display on the display screen 3, dependent on the value of the distraction level. In one example, the memory 6 indicates that the display screen 3 should be fully operational, with all the selectable images displayed, provided the distraction level is 0 or lower. Hence, when the vehicle is stationary, because the value of −1000 is assigned to the speed when the speed is 0, the value of the distraction level will be below 0 regardless of the value of any other signals, and therefore the display screen 3 will be fully operational, with all selectable items shown on the display screen 3.
A first exemplary embodiment of a possible structured display, in which particular areas displaying particular information are arranged in particular places on a display screen 3, in accordance with the invention is described with reference to
In the example, the memory 6 stores a reference table indicating that where the distraction level is between 1 and 20, the high distraction selectable areas 18 should be removed from the screen, but the low distraction selectable areas 17 may remain.
Thus, for example, suppose the traffic conditions are clear (=0), the road is straight (=0), there are no upcoming manoeuvres (=0), the vehicle is being driven at 40 kph (=5) by a driver who the apparatus 10 determines to be moderately drowsy (=10), on a road which the road condition sensor 11 indicates to be dry (=0), with the signal from the ECU 12 indicating that all the driver assistance systems (ACC, AEB, LKS) are turned on (=−15), and with the telephone off with no active call (=0), the radio inactive (=0), the volume at a relatively low level (=2) and the video off (=2). The distraction level determination module 7 will determine a distraction level of 5, the total of the values for each factor being monitored, and will output that value to the processor 5. The processor 5 will determine, based on a comparison with the reference table stored in the memory 6, that the value has passed the predetermined threshold of 1, at which the high distraction selectable areas 18 should be removed from the screen, and will dynamically remove the high distraction selectable areas 18 from the screen.
This is done by altering the display signal sent from the processor 5 to the display screen 3, so as to show the central information display image 16 expanding laterally, to occupy an oval shaped central region as shown in
The processor 5 continuously, or periodically, sends updated signals to the distraction level determination module 7, to update the value of the distraction level and, if the level passes the second threshold of 20 and enters a second region stored in the reference table in the memory, of between 20 and 100, the display signal sent by the processor 5 is again modified to alter the image shown on the display screen 3.
Thus, for example, if the traffic conditions remain clear (=0), there are no upcoming manoeuvres (=0), the vehicle is being driven by a driver who the apparatus 10 determines to be moderately drowsy (=10) on a road which the road condition sensor 11 indicates to be dry (=0), with the signal from the ECU 12 indicating that all the driver assistance systems (ACC, AEB, LKS) are turned on (=−15) (note that LKS and LDW are mutually exclusive), with an active telephone call (=10), the radio inactive (=0), the volume at a relatively low level (=2) and the video off (=2), but the speed sensor determines that the speed is 140 kph (=30), then the distraction level determination module 7 will determine a distraction level of 39, the total of the values for each factor being monitored.
Accordingly, the distraction level determination module 7 will output that value to the processor 5. The processor 5 will determine, based on a comparison with the reference table stored in the memory 6, that the value has passed the second predetermined threshold of 20, at which the low distraction selectable areas 17 should be removed from the display screen 3, and will dynamically remove the low distraction selectable areas 17 from the screen 3 to display the image shown in
In a further step, if the continuously or periodically monitored distraction level continues to increase and passes a third threshold of 100, entering a third region stored in the memory of between 100 and 2000, the processor 5 sends a signal to the display screen 3 modifying the image to show only a very basic image, e.g. the next manoeuvre if satellite navigation is active, or no image at all. At this stage, the processor may send additional signals to the control unit 14 or ECU 12 to shut down any further potential distractions.
As will be appreciated, in this example the upper limit of this final, third region cannot be exceeded, since the highest possible value for the distraction level would be 1207, and the circumstances in which that total would be reached are unrealistic, involving a speed in excess of 300 kph on a busy road with a rapidly approaching complex manoeuvre, potentially icy weather, and interaction with the telephone and radio. The final threshold of 100 will, however, always be reached when travelling at very high speed (in excess of 300 kph) and can be relatively easily reached when travelling at over 200 kph with additional potential distractions.
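The reference-table lookup used in this first example might be sketched as follows; the band boundaries (0, 20 and 100) follow the worked examples above, while the data structure and area names are assumptions for illustration.

```python
# Illustrative sketch of the reference table stored in the memory 6 for this
# first example. Band boundaries follow the description; names are assumptions.

DISPLAY_BANDS = [
    # (upper bound of band, areas removed from the structured display)
    (0,    set()),                                        # fully operational
    (20,   {"high_distraction_selectable_areas_18"}),
    (100,  {"high_distraction_selectable_areas_18",
            "low_distraction_selectable_areas_17"}),
    (2000, {"high_distraction_selectable_areas_18",
            "low_distraction_selectable_areas_17",
            "everything_except_a_basic_manoeuvre_image"}),  # only a basic image, or none
]

def removed_areas(distraction_level: float) -> set:
    for upper_bound, removed in DISPLAY_BANDS:
        if distraction_level <= upper_bound:
            return removed
    return DISPLAY_BANDS[-1][1]

print(removed_areas(-1000))  # stationary vehicle: nothing removed
print(removed_areas(5))      # first worked example: areas 18 removed
print(removed_areas(39))     # second worked example: areas 17 and 18 removed
```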
A second exemplary embodiment of a structured arrangement of images on a display screen is shown with reference to
In the second example, the memory 6 stores a reference table which correlates speed with distraction level. Thus, as shown in
Thus, as the speed is output to the processor 5, the processor 5 will determine, based on a comparison with the reference table stored in the memory 6, that the value has passed the predetermined threshold of 10 kph at which the high distraction selectable areas 22 should no longer be selectable and should be removed from the screen. In response to further increases in speed, the processor 5 will dynamically remove the high distraction selectable areas 22 by outputting an image signal which causes them to appear to fade and move downwards off the visible area 15 of the screen 3 as speed increases.
At the same time, with increasing speed, as shown in
The processor 5 continuously sends updated signals to the distraction level determination module 7, to update the value of the distraction level (i.e. in this example the speed) and, if the level passes the second threshold of 100 kph and enters a second region stored in the reference table in the memory, of between 100 kph and 200 kph, the display signal sent by the processor 5 is again modified to alter the image shown on the display screen 3.
Accordingly, the distraction level determination module 7 will output a value to the processor 5. The processor 5 will determine, based on a comparison with the reference table stored in the memory 6, that the value has passed the second predetermined threshold of 100 kph, at which the low distraction selectable areas 21 should no longer react to being selected and should also be removed from the structured display on the display screen 3, again in accordance with the graph shown in
In a further step, if the continuously monitored signal that determines the distraction level (i.e. the speed) continues to increase and passes a third threshold of 200 kph, entering a third region stored in the memory of between 200 and 500 kph, the processor 5 sends a signal to the display screen 3 modifying the image to remove the map image 20 and expand the navigation information image 19 to occupy substantially the entire visible area of the display screen 3, as shown in
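A sketch of this second, speed-only example is given below. The threshold speeds of 10, 100 and 200 kph follow the description; the linear fade/move curve, and the speeds at which each area has fully receded, stand in for the graph, which is not reproduced here.

```python
# Illustrative sketch of the second, speed-only example. Threshold speeds
# follow the description; the fade/move curve is an assumption.

def second_embodiment_layout(speed_kph: float) -> dict:
    def recede(start_kph: float, end_kph: float) -> float:
        # 0.0 = fully shown, 1.0 = faded and moved off the bottom of the screen
        return min(max((speed_kph - start_kph) / (end_kph - start_kph), 0.0), 1.0)

    return {
        "high_distraction_areas_22": {
            "selectable": speed_kph < 10,
            "recede": recede(10, 100),
        },
        "low_distraction_areas_21": {
            "selectable": speed_kph < 100,
            "recede": recede(100, 200),
        },
        "map_image_20_shown": speed_kph < 200,
        "navigation_image_19_fullscreen": speed_kph >= 200,
    }

print(second_embodiment_layout(50))   # areas 22 receding, areas 21 still selectable
print(second_embodiment_layout(220))  # map removed, navigation instruction fills the screen
```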
The above embodiments are described by way of example only. Many variations are possible without departing from the scope of the invention as defined in the appended claims.
Inventors: Stevens, Richard; Recktenwald, Benedict