A facial information detecting unit detects facial information related to facial features of a driver of a vehicle. A physical information detecting unit detects physical information related to physical features of the driver. A driving capability estimating unit estimates a driving capability of the driver on the basis of the facial information and the physical information. A facial expression estimating unit estimates the facial expression of the driver on the basis of the facial information. An emotion estimating unit estimates the emotion of the driver represented by the degree of comfort and the degree of activeness on the basis of temporal changes in the driving capability of the driver and the facial expression of the driver.
1. An emotion estimation device comprising processing circuitry
to detect facial information related to a facial feature of a driver of a vehicle,
to detect physical information related to a physical feature of the driver,
to estimate driving capability of the driver on a basis of the facial information and the physical information,
to estimate a facial expression of the driver as one from among a positive facial expression, a negative facial expression, and a neutral facial expression on a basis of the facial information, and
to estimate an emotion of the driver represented by a degree of comfort and a degree of activeness on a basis of a temporal change in the driving capability of the driver and the facial expression of the driver.
8. An emotion estimation method performed by processing circuitry comprising:
detecting facial information related to a facial feature of a driver of a vehicle;
detecting physical information related to a physical feature of the driver;
estimating driving capability of the driver on a basis of the facial information and the physical information;
estimating a facial expression of the driver as one from among a positive facial expression, a negative facial expression, and a neutral facial expression on a basis of the facial information; and
estimating an emotion of the driver represented by a degree of comfort and a degree of activeness on a basis of a temporal change in the facial expression of the driver in a case where it is estimated that the driver is capable of driving.
2. The emotion estimation device according to
3. The emotion estimation device according to
4. The emotion estimation device according to
5. The emotion estimation device according to
assigning a first number value to the driving capability of the driver; assigning a second number value to the facial expression of the driver;
gradually increasing or decreasing at least one of the first number value and the second number value based on the temporal change in the driving capability of the driver and the temporal change in the facial expression of the driver; and
estimating the emotion of the driver based on the first number value and the second number value.
6. The emotion estimation device according to
7. The emotion estimation device according to
This application is a National Stage of International Application No. PCT/JP2019/003854 filed Feb. 4, 2019.
The present invention relates to an emotion estimation device and an emotion estimation method for estimating an emotion of a driver.
Conventional emotion estimation devices detect physiological data and non-physiological data of a subject from, for example, an image captured by a camera, estimate the degree of awakening and the degree of comfort of the subject on the basis of the detected physiological data and non-physiological data, and select an emotion that corresponds to the estimated degree of awakening and degree of comfort (see, for example, Patent Literature 1).
Patent Literature 1: JP 2017-144222 A
The emotions estimated by a conventional emotion estimation device are useful for predicting driver actions, such as speeding and road rage, that may cause a traffic accident, so that a driving assistance device can adjust acceleration or deceleration, warn the driver, and so on. However, since emotions persist over time, there is a disadvantage that they cannot be estimated correctly from the degree of awakening and the degree of comfort alone.
The present invention has been made to solve the above-mentioned disadvantage, and an object of the present invention is to estimate emotions in consideration of temporal changes.
An emotion estimation device according to the present invention includes processing circuitry to detect facial information related to a facial feature of a driver of a vehicle, to detect physical information related to a physical feature of the driver, to estimate driving capability of the driver on a basis of the facial information and the physical information, to estimate a facial expression of the driver on a basis of the facial information, and to estimate an emotion of the driver represented by a degree of comfort and a degree of activeness on a basis of a temporal change in the driving capability of the driver and the facial expression of the driver.
According to the present invention, since an emotion is estimated on the basis of temporal changes in the driving capability of the driver and in the facial expression of the driver, the emotion can be estimated in consideration of those temporal changes.
In order to describe the present invention further in detail, embodiments for carrying out the present invention will be described below with reference to the accompanying drawings.
An imaging unit 2 is connected to the emotion estimation device 1. The imaging unit 2 is, for example, a camera installed near the steering wheel to capture images of the driver. The imaging unit 2 outputs the captured image of the driver to the emotion estimation device 1.
The emotion estimation device 1 includes a face detecting unit 11, a face parts detecting unit 12, a facial information detecting unit 13, a physical information detecting unit 14, a driving capability estimating unit 15, a facial expression estimating unit 16, and an emotion estimating unit 17.
The face detecting unit 11 acquires a captured image from the imaging unit 2. The face detecting unit 11 detects an area in which the driver's face is captured from the acquired captured image. The face detecting unit 11 outputs the face area of the captured image to the face parts detecting unit 12 and the physical information detecting unit 14.
The face parts detecting unit 12 detects face parts of the driver from the face area in the captured image detected by the face detecting unit 11. The face parts include the eyes, the nose, and the mouth. The face parts detecting unit 12 outputs the detected face parts to at least one of the facial information detecting unit 13 or the physical information detecting unit 14.
The functions of the face detecting unit 11 and the face parts detecting unit 12 are not necessarily included in the emotion estimation device 1 and may be included in an external device such as the imaging unit 2.
The facial information detecting unit 13 detects facial information related to the facial features of the driver on the basis of the face parts detected by the face parts detecting unit 12. The facial information includes at least one of the face orientation angle, the eye opening degree, the blinking speed, the mouth opening degree, the line-of-sight angle, or the head position of the driver. The facial information detecting unit 13 outputs the detected facial information to the driving capability estimating unit 15 and the facial expression estimating unit 16.
The physical information detecting unit 14 detects physical information related to the physical features of the driver on the basis of the face area detected by the face detecting unit 11 or the face parts detected by the face parts detecting unit 12. The physical information includes at least one of the heart rate, the heart rate variability, the heart rate interval (RRI), the brain wave, the pulse wave, the pulse rate variability, the blood pressure, the body temperature, or the sweat rate. Note that the physical information detecting unit 14 may detect the physical information using the face area or the face parts detected from the captured image of the imaging unit 2 or may detect the physical information using a detection result of a sensor (not illustrated). The physical information detecting unit 14 outputs the detected physical information to the driving capability estimating unit 15.
Note that the facial information is, of information related to biological functions, items regarded as facial features in physiology such as the face orientation angle, the eye opening degree, the blinking speed, the mouth opening degree, the line-of-sight angle, and the head position. The physical information is, of information related to biological functions, items regarded as physical features in physiology such as the heart rate, the heart rate variability, the RRI, the brain wave, the pulse wave, the pulse rate variability, the blood pressure, the body temperature, and the sweat rate.
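To make the separation between the two categories concrete, the following Python sketch holds the facial information and the physical information as separate records. The class and field names are illustrative assumptions and do not appear in the embodiment.

```python
# A minimal sketch, assuming illustrative field names for the items listed above.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class FacialInfo:
    face_angle_deg: Optional[float] = None        # face orientation angle
    eye_opening: Optional[float] = None           # eye opening degree
    blink_speed: Optional[float] = None           # blinking speed
    mouth_opening: Optional[float] = None         # mouth opening degree
    gaze_angle_deg: Optional[float] = None        # line-of-sight angle
    head_position: Optional[Tuple[float, float, float]] = None  # head position

@dataclass
class PhysicalInfo:
    heart_rate: Optional[float] = None            # heart rate
    rri_ms: Optional[float] = None                # heart rate interval (RRI)
    blood_pressure: Optional[float] = None        # blood pressure
    body_temperature: Optional[float] = None      # body temperature
    sweat_rate: Optional[float] = None            # sweat rate
```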
The driving capability estimating unit 15 estimates the driving capability of the driver on the basis of at least one of the facial information detected by the facial information detecting unit 13 and the physical information detected by the physical information detecting unit 14. For example, the driving capability estimating unit 15 estimates the driving capability that corresponds to the facial information detected by the facial information detecting unit 13 and the physical information detected by the physical information detecting unit 14 using a model in which the correspondence relation between the driving capability and the facial information and physical information of many unspecified people in normal times and abnormal times has been learned. Note that the driving capability estimating unit 15 may optimize the model for the driver of the host vehicle using the facial information and physical information of that driver. The driving capability has, for example, two levels: capable of driving and incapable of driving. When capable of driving, the driver is in a condition suitable for driving; when incapable of driving, the driver is in a condition unsuitable for driving or cannot drive. The driving capability estimating unit 15 outputs the estimated driving capability to the emotion estimating unit 17.
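As a rough illustration of this learning-based estimation, the following is a minimal sketch in which a generic binary classifier stands in for the learned model; the feature layout, the sample values, and the use of scikit-learn's LogisticRegression are assumptions made for illustration, not part of the embodiment.

```python
# A minimal sketch, assuming a generic classifier in place of the learned model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row concatenates facial and physical features
# (e.g., face angle, eye opening, blink speed, heart rate, RRI); the label is
# 1 = capable of driving, 0 = incapable of driving.
X_train = np.array([
    [0.0, 0.8, 0.30, 70.0, 850.0],    # normal-time sample -> capable
    [35.0, 0.1, 0.05, 55.0, 1100.0],  # abnormal-time sample -> incapable
    [5.0, 0.7, 0.25, 72.0, 830.0],
    [40.0, 0.05, 0.02, 50.0, 1200.0],
])
y_train = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(X_train, y_train)

def estimate_driving_capability(facial_features, physical_features):
    """Return 'capable' or 'incapable' for one observation."""
    x = np.array([facial_features + physical_features])
    return "capable" if model.predict(x)[0] == 1 else "incapable"

print(estimate_driving_capability([2.0, 0.75, 0.28], [71.0, 840.0]))
```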
The facial expression estimating unit 16 estimates the facial expression of the driver on the basis of the facial information detected by the facial information detecting unit 13. For example, the facial expression estimating unit 16 estimates the facial expression that corresponds to the detected facial information on the basis of a predetermined correspondence relation between facial information and facial expressions (a so-called rule base). Alternatively, the facial expression estimating unit 16 may estimate the facial expression using a model in which the correspondence relation between the facial information and the facial expressions of many unspecified people has been learned (so-called machine learning). Further alternatively, the facial expression estimating unit 16 may estimate the facial expression by combining the rule base and the machine learning. As the facial expression, three types are set, for example: positive (smiling), negative (glum or crying), and neutral (other facial expressions). The facial expression estimating unit 16 outputs the estimated facial expression to the emotion estimating unit 17.
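The embodiment does not specify a concrete rule set, so the following is only a minimal rule-base sketch; the thresholds and the mouth-corner feature are hypothetical and merely illustrate how facial information could be mapped to the three expression types.

```python
# A minimal rule-base sketch; thresholds and the mouth-corner feature are
# hypothetical assumptions, not values from the embodiment.
def estimate_facial_expression(mouth_corner_rise: float,
                               mouth_opening: float,
                               eye_opening: float) -> str:
    """Classify the facial expression as 'positive', 'negative', or 'neutral'."""
    if mouth_corner_rise > 0.5 and eye_opening > 0.3:
        return "positive"   # e.g., smiling
    if mouth_corner_rise < -0.5 or (mouth_opening > 0.7 and eye_opening < 0.2):
        return "negative"   # e.g., glum or crying
    return "neutral"        # other facial expressions

print(estimate_facial_expression(mouth_corner_rise=0.8,
                                 mouth_opening=0.2,
                                 eye_opening=0.6))  # positive
```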
The emotion estimating unit 17 estimates the emotion of the driver represented by the degree of comfort and the degree of activeness on the basis of temporal changes in the driving capability estimated by the driving capability estimating unit 15 and temporal changes in the facial expression estimated by the facial expression estimating unit 16. Note that the “active” state in which the degree of activeness is high is a state in which the driver is awake or sober, and the “inactive” state in which the degree of activeness is low is a state in which the driver is sleeping or drunk.
In a case where the driver is incapable of driving, emotions are unlikely to be expressed in a facial expression, and it is difficult to estimate an emotion from the facial expression. Therefore, in a case where it is estimated by the driving capability estimating unit 15 that the driver is incapable of driving, the emotion estimating unit 17 determines the degree of activeness as being inactive.
On the other hand, in a case where it is estimated by the driving capability estimating unit 15 that the driver is capable of driving, the emotion estimating unit 17 determines the degree of activeness as being active. In addition, in a case where it is estimated by the driving capability estimating unit 15 that the driver is capable of driving, the emotion estimating unit 17 determines the degree of comfort as being comfortable when the facial expression estimated by the facial expression estimating unit 16 is positive, as being expressionless when the facial expression is neutral, and as being uncomfortable when the facial expression is negative.
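A minimal sketch of the determination logic in the two preceding paragraphs is shown below; the function and value names are illustrative assumptions.

```python
# A minimal sketch of the determination described above: inactive when incapable of
# driving; otherwise active, with the degree of comfort taken from the facial
# expression. Function and value names are illustrative assumptions.
def determine_axes(driving_capability: str, facial_expression: str):
    """Map driving capability and facial expression to (activeness, comfort)."""
    if driving_capability == "incapable":
        return "inactive", None          # emotion is hard to read from the face
    comfort = {"positive": "comfortable",
               "neutral": "expressionless",
               "negative": "uncomfortable"}[facial_expression]
    return "active", comfort

print(determine_axes("capable", "positive"))    # ('active', 'comfortable')
print(determine_axes("incapable", "neutral"))   # ('inactive', None)
```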
In the following, an example of estimating an emotion using Russell's circumplex model will be described.
Moreover, in a case where an emotion-expressing point is plotted in an active and comfortable area 31 in the graph of
Next, a specific example of emotion estimation will be described.
In
In a case where the driving capability of the driver becomes “capable”, the emotion estimating unit 17 moves the emotion-expressing point 52 to the active side by an amount that corresponds to the duration of the “capable” driving capability. When the emotion-expressing point 52 moves to an emotion-expressing point 53 and moves away from the area 33 as illustrated in
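The following sketch illustrates one possible way to move an emotion-expressing point gradually on the comfort/activeness plane in proportion to how long the current driving capability and facial expression have lasted; the step size, coordinate range, and update rule are assumptions for illustration and are not taken from the embodiment.

```python
# A minimal sketch, assuming a [-1, 1] comfort/activeness plane and a fixed step
# size; the update rule below is an illustration, not the rule of the embodiment.
def update_point(point, driving_capability, facial_expression, step=0.1):
    """Move an emotion-expressing point (comfort, activeness) one time step."""
    comfort, activeness = point
    # Activeness drifts toward active while the driver stays capable of driving,
    # and toward inactive while the driver stays incapable.
    activeness += step if driving_capability == "capable" else -step
    # Comfort drifts toward comfortable for a positive expression, toward
    # uncomfortable for a negative one, and relaxes toward 0 when neutral.
    if facial_expression == "positive":
        comfort += step
    elif facial_expression == "negative":
        comfort -= step
    else:
        comfort -= max(-step, min(step, comfort))
    clamp = lambda v: max(-1.0, min(1.0, v))
    return clamp(comfort), clamp(activeness)

point = (0.0, -0.9)                  # e.g., starting inside a drowsy area
for _ in range(10):                  # driver judged capable, neutral expression
    point = update_point(point, "capable", "neutral")
print(point)                         # activeness has drifted toward the active side
```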
Next, the operation of the emotion estimation device 1 will be described.
In step ST1, the face detecting unit 11 detects the face area of the driver from the captured image of the imaging unit 2. In step ST2, the face parts detecting unit 12 detects face parts from the face area in the captured image. Note that, as described above, the functions of the face detecting unit 11 and the face parts detecting unit 12 may be included in an external device. In such a configuration, the emotion estimation device 1 does not perform the operations in steps ST1 and ST2.
In step ST3, the facial information detecting unit 13 detects facial information related to the facial features of the driver on the basis of the face parts. In step ST4, the physical information detecting unit 14 detects physical information related to the physical features of the driver on the basis of the face area or the face parts in the captured image. In step ST5, the driving capability estimating unit 15 estimates the driving capability of the driver on the basis of the facial information and the physical information. In step ST6, the facial expression estimating unit 16 estimates the facial expression of the driver on the basis of the facial information. In step ST7, the emotion estimating unit 17 estimates an emotion represented by the degree of comfort and the degree of activeness on the basis of temporal changes in the driving capability and the facial expression of the driver.
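A structural sketch of steps ST1 to ST7 for one captured frame is given below; every helper function is a hypothetical stub standing in for the corresponding unit 11 to 17, and only the order of the calls mirrors the flowchart.

```python
# A structural sketch of steps ST1 to ST7 for one frame. Every function below is a
# hypothetical stub standing in for the corresponding unit; only the call order
# mirrors the flowchart.
def detect_face(image):                    return image                     # ST1: unit 11 (stub)
def detect_face_parts(face_area):          return {"eyes": None, "mouth": None}  # ST2: unit 12 (stub)
def detect_facial_info(face_parts):        return {"eye_opening": 0.7}      # ST3: unit 13 (stub)
def detect_physical_info(area, parts):     return {"heart_rate": 70.0}      # ST4: unit 14 (stub)
def estimate_capability(facial, physical): return "capable"                 # ST5: unit 15 (stub)
def estimate_expression(facial):           return "neutral"                 # ST6: unit 16 (stub)
def estimate_emotion(capability, expr):    return ("active", "expressionless")  # ST7: unit 17 (stub)

def process_frame(captured_image):
    face_area = detect_face(captured_image)
    face_parts = detect_face_parts(face_area)
    facial_info = detect_facial_info(face_parts)
    physical_info = detect_physical_info(face_area, face_parts)
    capability = estimate_capability(facial_info, physical_info)
    expression = estimate_expression(facial_info)
    return estimate_emotion(capability, expression)

print(process_frame(captured_image=None))
```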
The operation of the emotion estimation device 1 illustrated in the flowchart of
As described above, the emotion estimation device 1 according to the first embodiment includes the facial information detecting unit 13, the physical information detecting unit 14, the driving capability estimating unit 15, the facial expression estimating unit 16, and the emotion estimating unit 17. The facial information detecting unit 13 detects facial information related to facial features of a driver of the vehicle. The physical information detecting unit 14 detects physical information related to physical features of the driver. The driving capability estimating unit 15 estimates the driving capability of the driver on the basis of the facial information and the physical information. The facial expression estimating unit 16 estimates the facial expression of the driver on the basis of the facial information. The emotion estimating unit 17 estimates the emotion of the driver represented by the degree of comfort and the degree of activeness on the basis of temporal changes in the driving capability of the driver and the facial expression of the driver. With this configuration, the emotion estimation device 1 can estimate not only the emotion of the driver at a certain point in time but also an emotion in consideration of temporal changes, such as the driver gradually becoming excited or gradually calming down.
The emotion estimating unit 17 of the first embodiment estimates the emotion of the driver on the basis of a duration of the facial expression of the driver in a case where it is estimated by the driving capability estimating unit 15 that the driver is capable of driving. As a result, the emotion estimating unit 17 can accurately estimate the emotion even when the driver is in a state in which emotions are unlikely to appear in the facial expression.
Furthermore, the emotion estimating unit 17 of the first embodiment determines the degree of comfort and the degree of activeness depending on the duration of the driving capability of the driver and the duration of the facial expression of the driver and estimates the emotion of the driver that corresponds to the degree of comfort and the degree of activeness that have been determined by referring to a predetermined correspondence relation among the degree of comfort, the degree of activeness, and emotions. As a result, the emotion estimating unit 17 can easily estimate an emotion in consideration of the temporal changes as illustrated in
The warning unit 18 includes at least one of a display or a speaker. The warning unit 18 warns the driver or prompts switching from manual driving to autonomous driving depending on the emotion estimated by the emotion estimating unit 17. For example, when the driver gradually gets excited and an emotion-expressing point enters the area 31 of
Note that the warning unit 18 may end the display of the warning screen when a certain period of time has passed after the warning screen has been displayed, or may end the display of the warning screen when the driver calms down from a state of excitement or discontent or awakens from a state of drowsiness. Similarly, the warning unit 18 may end the output of the warning sound when a certain period of time has passed after the warning sound has been output, or may end the output of the warning sound when the driver calms down from a state of excitement or discontent or awakens from a state of drowsiness.
Moreover, when the driver gradually feels drowsy and the emotion-expressing point enters the area 33 of
Furthermore, the emotion estimation device 1 may automatically perform switching from manual driving to autonomous driving when the driver feels drowsy. Specifically, the driving switching unit 19 instructs a vehicle control device (not illustrated) to switch from manual driving to autonomous driving when the driver gradually feels drowsy and the emotion-expressing point enters the area 33 of
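The following sketch illustrates how the warning and switching behaviour described above could be triggered from the emotion-expressing point; modelling area 31 and area 33 as simple regions of the comfort/activeness plane, and the callback names, are assumptions for illustration.

```python
# A minimal sketch, assuming area 31 and area 33 can be tested as simple regions of
# the comfort/activeness plane; the region bounds and callback names are
# illustrative assumptions.
def in_area_31(point):
    comfort, activeness = point
    return activeness > 0.8 and comfort > 0.5     # e.g., excited

def in_area_33(point):
    comfort, activeness = point
    return activeness < -0.8                      # e.g., drowsy

def supervise(point, warn, request_autonomous_driving):
    """Warn the driver or request switching depending on the estimated emotion."""
    if in_area_31(point):
        warn("please calm down")                  # warning unit 18
    elif in_area_33(point):
        request_autonomous_driving()              # driving switching unit 19

supervise((0.6, 0.9), warn=print,
          request_autonomous_driving=lambda: print("switch to autonomous driving"))
```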
As described above, the emotion estimation device 1 according to the second embodiment includes the warning unit 18. The warning unit 18 warns the driver or prompts switching to autonomous driving depending on the emotion of the driver estimated by the emotion estimating unit 17. As a result, the emotion estimation device 1 can warn at appropriate timing depending on the emotion of the driver.
Lastly, the hardware configuration of the emotion estimation device 1 according to each of the embodiments will be described.
In the case where the processing circuit is dedicated hardware as illustrated in
As illustrated in
Here, the processor 101 refers to, for example, a central processing unit (CPU), a processing device, an arithmetic device, or a microprocessor.
The memory 102 may be a nonvolatile or volatile semiconductor memory such as a random access memory (RAM), a read only memory (ROM), an erasable programmable ROM (EPROM), or a flash memory, a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a compact disc (CD) or a digital versatile disc (DVD).
Note that some of the functions of the face detecting unit 11, the face parts detecting unit 12, the facial information detecting unit 13, the physical information detecting unit 14, the driving capability estimating unit 15, the facial expression estimating unit 16, the emotion estimating unit 17, the warning unit 18, and the driving switching unit 19 may be implemented by dedicated hardware, and some may be implemented by software or firmware. In this manner, the processing circuit in the emotion estimation device 1 can implement the above functions by hardware, software, firmware, or a combination thereof.
Note that the present invention may include a flexible combination of the embodiments, a modification of any component of the embodiments, or omission of any component in the embodiments within the scope of the present invention.
An emotion estimation device according to the present invention is suitable for an emotion estimation device for estimating an emotion of a driver, a passenger, or the like who has boarded a mobile object including a vehicle, a train, a ship, or an aircraft.
1: emotion estimation device, 2: imaging unit, 11: face detecting unit, 12: face parts detecting unit, 13: facial information detecting unit, 14: physical information detecting unit, 15: driving capability estimating unit, 16: facial expression estimating unit, 17: emotion estimating unit, 18: warning unit, 19: driving switching unit, 31 to 33: area, 41 to 44, 51 to 53: emotion-expressing point, 100: processing circuit, 101: processor, 102: memory
References Cited:
US 7,821,409 (priority Mar. 26, 2007), Denso Corporation, "Drowsiness alarm apparatus and program"
US 2014/0114536
US 2014/0171752
US 2017/0105662
US 2017/0108864
US 2017/0150930
US 2019/0213429
JP 2016-71577
JP 2017-100039
JP 2017-144222
WO 2013/008301