A live facial recognition method and system capture a visual image and obtain temperature information of a person under recognition, and derive facial features from the visual image; compare the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and, if the difference is less than a predetermined threshold value, determine whether the facial temperature conforms to a live facial recognition criterion according to the temperature information.
8. A live facial recognition method, comprising:
capturing a visual image and obtaining temperature information of a person under recognition, facial features being derived according to the visual image;
comparing the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and
determining whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
normalizing the temperature information to catch a recognition area in the temperature information;
dividing the recognition area into a plurality of sub-areas;
selecting a feature area in the temperature information, the feature area including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
17. A live facial recognition system, comprising:
a processor;
an image capture device that is controlled by the processor to capture a visual image of a person under recognition, facial features being derived according to the visual image;
a temperature detection device that is controlled by the processor to obtain temperature information of the person under recognition; and
a storage device that stores a facial feature database;
wherein the processor compares the facial features of the visual image with corresponding facial features of the facial feature database to obtain a difference therebetween; and the processor determines whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
normalizing the temperature information to catch a recognition area in the temperature information;
dividing the recognition area into a plurality of sub-areas;
selecting a feature area in the temperature information, the feature area including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
1. A live facial recognition method, comprising:
capturing a visual image and obtaining temperature information of a person under recognition, facial features being derived according to the visual image;
comparing the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and
determining whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
performing mapping between the visual image and the temperature information;
normalizing the visual image to catch a recognition area in the visual image;
catching a corresponding recognition area in the temperature information according to mapping relationship between the visual image and the temperature information;
dividing the recognition area of the temperature information into a plurality of sub-areas;
selecting a feature area in the visual image and a corresponding feature area in the temperature information, the feature area of the temperature information including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
10. A live facial recognition system, comprising:
a processor;
an image capture device that is controlled by the processor to capture a visual image of a person under recognition, facial features being derived according to the visual image;
a temperature detection device that is controlled by the processor to obtain temperature information of the person under recognition; and
a storage device that stores a facial feature database;
wherein the processor compares the facial features of the visual image with corresponding facial features of the facial feature database to obtain a difference therebetween; and the processor determines whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
performing mapping between the visual image and the temperature information;
normalizing the visual image to catch a recognition area in the visual image;
catching a corresponding recognition area in the temperature information according to mapping relationship between the visual image and the temperature information;
dividing the recognition area of the temperature information into a plurality of sub-areas;
selecting a feature area in the visual image and a corresponding feature area in the temperature information, the feature area of the temperature information including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
2. The method of
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
capturing an image of a person under test, resulting in a captured image;
performing face detection on the captured image;
catching a face image from the captured image according to results of the face detection;
extracting facial features from the face image;
numericalizing the facial features to generate facial feature values; and
building a model according to the facial feature values, thereby generating the facial feature database.
9. The method of
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
11. The system of
12. The system of
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
13. The system of
14. The system of
15. The system of
16. The system of
18. The system of
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
This application claims priority of Taiwan Patent Application No. 106135308, filed on Oct. 16, 2017, the entire contents of which are herein expressly incorporated by reference.
1. Field of the Invention
The present invention generally relates to facial recognition, and more particularly to a live facial recognition method and system.
2. Description of Related Art
Facial recognition is a computer image processing technique capable of identifying facial features in a digital image or a video frame, and can be used as a security measure. Facial recognition is one of several biometric techniques, such as fingerprint or iris recognition. It may be adapted to electronic devices such as computers, mobile phones and card readers. Particularly, as mobile devices become more popular, such security measures are in high demand.
A conventional facial recognition system uses a two-dimensional (2D) camera to capture an image, from which facial features are extracted and compared with a database. However, the conventional facial recognition system usually cannot distinguish a real person from a picture while performing recognition, leaving a security loophole that can be exploited.
In order to enhance the reliability of the security measure, some facial recognition systems ask a user to act according to a given instruction, such as swinging or rotating the head, opening the mouth or closing the eyes. Further, images may be captured while the user acts on the instruction, from which depth information may be obtained and used to identify a real person. Nevertheless, those schemes take time and cause inconvenience.
A need has thus arisen to propose a novel facial recognition scheme capable of maintaining or enhancing the reliability of the security measure while accelerating facial recognition conveniently.
In view of the foregoing, it is an object of the embodiment of the present invention to provide a live facial recognition method and system capable of quickly, accurately and conveniently recognizing a face.
According to one embodiment, a visual image and temperature information of a person under recognition are captured, and facial features are derived according to the visual image. The facial features of the visual image are compared with corresponding facial features of a facial feature database to obtain a difference therebetween. If the difference is less than a predetermined threshold value, it is determined whether the facial temperature conforms to a live facial recognition criterion according to the temperature information.
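The two-stage flow of this embodiment can be sketched as follows. The Euclidean distance, the threshold value and the function names are illustrative assumptions, not the patent's actual implementation; a real system would use its own feature metric and a calibrated threshold.

```python
import numpy as np

# Hypothetical threshold for illustration only.
FEATURE_THRESHOLD = 0.6

def recognize(features, db_features, temperature_ok):
    """Sketch of the two-stage flow: facial-feature comparison first,
    then a liveness check based on facial temperature."""
    # Difference between the derived features and the database entry.
    difference = np.linalg.norm(np.asarray(features) - np.asarray(db_features))
    if difference >= FEATURE_THRESHOLD:
        return "fail"   # facial features do not match the database
    if not temperature_ok:
        return "fail"   # features match, but the liveness criterion failed
    return "success"
```

Note that the temperature check is only reached when the feature difference is already below the threshold, mirroring the claimed ordering.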
In the embodiment, the temperature detection device 13, controlled by the processor 11, may be configured to detect temperature in a specific area in order to obtain temperature information. The temperature detection device 13 may include a temperature sensor array (e.g., an infrared temperature sensor array) that is capable of obtaining a temperature image representing the temperature information of an entire face. The temperature sensor array is also called a temperature camera (e.g., an infrared camera) in this specification. Alternatively, the temperature detection device 13 may include a single temperature sensor (e.g., an infrared temperature sensor or an infrared thermometer) that is capable of detecting plural temperature data representing the temperature information at specific points of a face. The single temperature sensor may adopt remote sensing to detect one point at a time. In one embodiment, the temperature detection device 13 is embedded in the image capture device 12 (i.e., the image capture device 12 and the temperature detection device 13 are integrated), and therefore the temperature information and the visual image can be obtained at the same time.
In the embodiment, the memory unit 14 may be configured to store a computer program and data for executing the computer program, by which the processor 11 may perform live facial recognition. The memory unit 14 may include a random access memory such as a dynamic random access memory (DRAM), a static random access memory (SRAM) or other memory units suitable for storing a computer program. The storage device 15 of the embodiment may be configured to store a database adaptable to live facial recognition. The storage device 15 may include a hard disk, a solid state disk (SSD) or other storage devices suitable for storing a database.
Prior to live facial recognition, a facial feature database should be generated through registration and modeling. Afterwards, live facial recognition may be performed by comparing the facial features of a person under recognition with the facial feature database.
In step 25, the processor 11 may numericalize the facial features to generate facial feature values. Next, in step 26, a model is built according to the facial feature values, and a facial feature database is accordingly generated and stored in the storage device 15 (step 27).
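The registration-and-modeling steps above might be sketched as follows. The feature extractor is a placeholder (the patent does not specify one), and keeping the mean feature vector as the "model" is a deliberately trivial stand-in for a real modeling step:

```python
import numpy as np

def build_feature_database(face_images, extract_features):
    """Sketch of registration and modeling (steps 25-27): numericalize
    the facial features of each registered face image, then build a
    simple model from the resulting feature values.

    `extract_features` is a hypothetical feature extractor returning a
    numeric feature vector for a face image."""
    vectors = [np.asarray(extract_features(img), dtype=float)
               for img in face_images]
    # A trivial "model": the mean feature vector over all samples.
    model = np.mean(vectors, axis=0)
    return {"model": model, "samples": len(vectors)}
```

The returned dictionary stands in for the facial feature database stored in the storage device 15.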
In step 32, the temperature detection device 13 (in the embodiment, a temperature camera such as an infrared camera) may capture a temperature image (e.g., an infrared image) of the person under recognition. In the embodiment, the temperature detection device 13 is separate from the image capture device 12, and the temperature image may represent the temperature information. In general, step 32 may be performed at any time prior to step 35. The information obtained by the temperature camera (e.g., the infrared camera) may be transformed into the temperature image. For example, intensity of an infrared detection value may be transformed into a corresponding temperature.
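The transformation from infrared detection values to temperatures might look like the sketch below. The linear mapping and the 20-40° C. range are assumptions for illustration; a real infrared camera uses a calibrated, sensor-specific transfer function.

```python
import numpy as np

def intensity_to_temperature(ir_image, t_min=20.0, t_max=40.0):
    """Sketch of transforming infrared detection intensities into a
    temperature image (step 32). Assumes raw intensities in [0, 255]
    and a linear sensor response over [t_min, t_max] degrees C."""
    ir = np.asarray(ir_image, dtype=float)
    return t_min + (ir / 255.0) * (t_max - t_min)
```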
In step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36).
On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 35.
According to one aspect of the embodiment, in step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37).
Subsequently, in step 352, the visual image may be normalized to catch a recognition area. In one example, a recognition area (e.g., a rectangular area) that substantially covers a facial contour may be caught in the visual image. For example, a rectangular recognition area is defined by two horizontal lines respectively passing the top and bottom facial contour edges, and two vertical lines respectively passing the right and left facial contour edges. In another example, a recognition area is caught by removing pixels whose temperature is lower than the average temperature of the temperature image. If it is determined that the person under recognition wears a mask (which causes the temperature below the nose to be lower than the temperature at the nose) or wears glasses (which causes the temperature at the eyes to be lower than the temperature at the nose), the corresponding area may also be removed.
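The second example above (catching a recognition area by discarding pixels colder than the image average) can be sketched as follows; the bounding-box step is an illustrative assumption about how the remaining pixels become a rectangular area:

```python
import numpy as np

def catch_recognition_area(temperature_image):
    """Sketch of catching a recognition area in a temperature image by
    removing pixels colder than the image's average temperature, then
    taking the bounding box of the remaining (warmer) pixels."""
    t = np.asarray(temperature_image, dtype=float)
    mask = t >= t.mean()                 # keep pixels at or above average
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    top = np.argmax(rows)
    bottom = len(rows) - 1 - np.argmax(rows[::-1])
    left = np.argmax(cols)
    right = len(cols) - 1 - np.argmax(cols[::-1])
    return t[top:bottom + 1, left:right + 1]
```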
In step 353, a corresponding recognition area is caught in the temperature image according to the mapping relationship between the visual image and the temperature image. In step 354, the recognition area of the temperature image is divided into sub-areas (e.g., 5×5 sub-areas). In another example, sub-areas may be obtained according to feature points shown in
In step 355, a feature area (that includes at least one sub-area) is selected in the visual image, and a corresponding feature area is then obtained in the temperature image. For example, a feature area of the temperature image that includes at least one sub-area covering a facial organ (e.g., the nose) is obtained. Subsequently, in step 356, the temperature of the feature area of the temperature image is compared with the temperature of at least one neighboring sub-area. The at least one neighboring sub-area may include a top neighboring sub-area and a bottom neighboring sub-area; a top, a bottom, a left and a right neighboring sub-area; three top neighboring sub-areas and three bottom neighboring sub-areas; or eight neighboring sub-areas along eight directions respectively. In this specification, the temperature of an area may refer to a mean temperature or a median temperature of the area, or to a mean value, a maximum value or a minimum value of some intermediate temperatures.
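Steps 354-356 might be sketched as below. The 5×5 grid, the per-sub-area mean temperature, and the top/bottom neighbor choice are each one of the alternatives the text lists, picked here for concreteness:

```python
import numpy as np

def split_into_subareas(recognition_area, rows=5, cols=5):
    """Sketch of step 354: divide the recognition area of the
    temperature image into rows x cols sub-areas and return each
    sub-area's mean temperature."""
    t = np.asarray(recognition_area, dtype=float)
    h, w = t.shape
    means = np.empty((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = t[i * h // rows:(i + 1) * h // rows,
                      j * w // cols:(j + 1) * w // cols]
            means[i, j] = block.mean()
    return means

def feature_colder_than_neighbors(means, i, j):
    """Sketch of step 356 for a feature sub-area at (i, j), e.g. one
    covering the nose: compare it with its top and bottom neighbors
    and report whether the feature area is the colder one."""
    return means[i, j] < means[i - 1, j] and means[i, j] < means[i + 1, j]
```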
Finally, in step 357, if the comparing result (of step 356) conforms to a predetermined (or normal) facial temperature distribution, the live facial recognition criterion is asserted. In this specification, the predetermined facial temperature distribution may refer to a mean value or a median value among plural people, or to a mean value of some intermediate values among plural people. Taking a feature area covering the nose as an example, the normal facial temperature distribution is conformed to if the temperature of the feature area is lower than the temperature of the neighboring sub-areas; on the contrary, it is not conformed to if the temperature of the feature area is higher. In the embodiment, the predetermined facial temperature distribution (step 357) may include one or more of the following requirements:
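The requirements are enumerated in the claims (e.g., facial temperature higher than ambient temperature and higher than 28° C.; temperature above the eyes higher than below them; the nose the coolest area below the eyes). A few of them might be checked as in this minimal sketch, in which the function name and the scalar inputs are hypothetical:

```python
def conforms_to_facial_distribution(ambient, face, above_eyes, below_eyes, nose):
    """Sketch of step 357: check a subset of the claimed requirements of
    a normal facial temperature distribution. All values are in degrees
    Celsius; a real system would pick requirements suited to its sensor."""
    checks = [
        face > ambient,           # facial temperature exceeds ambient
        face > 28.0,              # facial temperature is higher than 28 C
        above_eyes > below_eyes,  # warmer above the eyes than below them
        nose < below_eyes,        # nose is the coolest area below the eyes
    ]
    return all(checks)
```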
Subsequently, in step 354, the recognition area is divided into sub-areas (e.g., 5×7 sub-areas). In step 355, a feature area (that includes at least one sub-area) is selected in the temperature image. For example, a sub-area with low temperature in the middle of the temperature image is selected as a feature area covering the nose.
In another example, the recognition area is not divided into rectangular sub-areas in step 354. Instead, the nose may be located in an area with low temperature in the middle of the temperature image, followed by locating left and right eyes in an area with higher temperature above the nose. A feature area (i.e., a triangular area) may be defined by the located nose and eyes (step 355).
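This alternative of steps 354-355 can be sketched as follows. Taking the global temperature minimum as the nose and the two warmest points above it as the eyes is a simplifying assumption; the text only requires the nose to lie in a cold area near the middle of the image:

```python
import numpy as np

def locate_triangle_feature(temperature_image):
    """Sketch of locating the nose at the coldest point of the
    temperature image and the eyes at the two warmest points above it;
    the three points define a triangular feature area (step 355)."""
    t = np.asarray(temperature_image, dtype=float)
    nose = np.unravel_index(np.argmin(t), t.shape)
    above = t[:nose[0], :]                    # region above the nose
    warmest = np.argsort(above, axis=None)[-2:]  # two warmest points
    eyes = [np.unravel_index(k, above.shape) for k in warmest]
    return nose, eyes
```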
Subsequently, in step 356, the temperature of the feature area is compared with the temperature of at least one neighboring sub-area. Finally, in step 357, if the comparing result (of step 356) conforms to a predetermined (or normal) facial temperature distribution, the live facial recognition criterion is asserted.
Subsequently, in step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36). On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 35.
In step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37). The live facial recognition criterion of step 35 is detailed in the flow diagram of
In step 31, the image capture device 12 (e.g., a color camera) may capture a visual image (e.g., a color image) of a person under recognition, according to which facial feature values may be obtained. Next, in step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36). On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 32B.
In step 32B, according to facial positions detected in step 31, the temperature sensor (e.g., an infrared temperature sensor or an infrared thermometer) may adopt remote sensing to obtain plural temperature data representing the temperature information at specific points of a face.
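Step 32B might be sketched as below. The sensor read is abstracted as a callable, and the point names are hypothetical; the patent only requires temperature data at specific facial positions detected in the visual image:

```python
def sample_facial_points(temperature_at, points):
    """Sketch of step 32B: read temperature data at specific facial
    positions (e.g., nose, eyes) detected in the visual image.

    `temperature_at` stands in for a remote-sensing read of the single
    temperature sensor at pixel position (x, y)."""
    return {name: temperature_at(x, y) for name, (x, y) in points.items()}
```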
In step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37). The live facial recognition criterion of step 35 is detailed in the flow diagram of
Although specific embodiments have been illustrated and described, it will be appreciated by those skilled in the art that various modifications may be made without departing from the scope of the present invention, which is intended to be limited solely by the appended claims.
Chen, Yin-Yu, Chang, Yao-Tsung
Assignments:
Nov 16 2017 — CHANG, YAO-TSUNG to Wistron Corporation (assignment of assignors interest; Reel/Frame 044354/0275)
Nov 17 2017 — CHEN, YIN-YU to Wistron Corporation (assignment of assignors interest; Reel/Frame 044354/0275)
Dec 11 2017 — Wistron Corporation (assignment on the face of the patent)