A live facial recognition method and system includes capturing a visual image and obtaining temperature information of a person under recognition, and deriving facial features according to the visual image; comparing the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and, if the difference is less than a predetermined threshold value, determining whether facial temperature conforms to a live facial recognition criterion according to the temperature information.

Patent No.: 10,565,461
Priority: Oct. 16, 2017
Filed: Dec. 11, 2017
Issued: Feb. 18, 2020
Expiry: Jul. 12, 2038 (213-day term extension)
Assignee: Wistron Corporation
Entity: Large
Status: Currently OK
8. A live facial recognition method, comprising:
capturing a visual image and obtaining temperature information of a person under recognition, facial features being derived according to the visual image;
comparing the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and
determining whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
normalizing the temperature information to catch a recognition area in the temperature information;
dividing the recognition area into a plurality of sub-areas;
selecting a feature area in the temperature information, the feature area including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
17. A live facial recognition system, comprising:
a processor;
an image capture device that is controlled by the processor to capture a visual image of a person under recognition, facial features being derived according to the visual image;
a temperature detection device that is controlled by the processor to obtain temperature information of the person under recognition; and
a storage device that stores a facial feature database;
wherein the processor compares the facial features of the visual image with corresponding facial features of the facial feature database to obtain a difference therebetween; and the processor determines whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
normalizing the temperature information to catch a recognition area in the temperature information;
dividing the recognition area into a plurality of sub-areas;
selecting a feature area in the temperature information, the feature area including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
1. A live facial recognition method, comprising:
capturing a visual image and obtaining temperature information of a person under recognition, facial features being derived according to the visual image;
comparing the facial features of the visual image with corresponding facial features of a facial feature database to obtain a difference therebetween; and
determining whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
performing mapping between the visual image and the temperature information;
normalizing the visual image to catch a recognition area in the visual image;
catching a corresponding recognition area in the temperature information according to a mapping relationship between the visual image and the temperature information;
dividing the recognition area of the temperature information into a plurality of sub-areas;
selecting a feature area in the visual image and a corresponding feature area in the temperature information, the feature area of the temperature information including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
10. A live facial recognition system, comprising:
a processor;
an image capture device that is controlled by the processor to capture a visual image of a person under recognition, facial features being derived according to the visual image;
a temperature detection device that is controlled by the processor to obtain temperature information of the person under recognition; and
a storage device that stores a facial feature database;
wherein the processor compares the facial features of the visual image with corresponding facial features of the facial feature database to obtain a difference therebetween; and the processor determines whether facial temperature conforms to a live facial recognition criterion according to the temperature information, if the difference is less than a predetermined threshold value;
wherein the live facial recognition criterion comprises:
performing mapping between the visual image and the temperature information;
normalizing the visual image to catch a recognition area in the visual image;
catching a corresponding recognition area in the temperature information according to a mapping relationship between the visual image and the temperature information;
dividing the recognition area of the temperature information into a plurality of sub-areas;
selecting a feature area in the visual image and a corresponding feature area in the temperature information, the feature area of the temperature information including at least one said sub-area;
comparing temperature of the feature area of the temperature information with temperature of at least one neighboring sub-area; and
asserting the live facial recognition criterion if a comparing result conforms to a predetermined facial temperature distribution.
2. The method of claim 1, wherein the predetermined facial temperature distribution comprises one or more of the following requirements:
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
3. The method of claim 1, wherein the temperature information comprises a temperature image, the visual image and the temperature image are captured by an image capture device and a temperature detection device respectively, and the image capture device is separate from the temperature detection device.
4. The method of claim 1, wherein the temperature information comprises a temperature image, the visual image and the temperature image are respectively captured by an image capture device and a temperature detection device at a same time, and the image capture device and the temperature detection device are integrated.
5. The method of claim 1, wherein the temperature information comprises plural temperature data, the visual image and the plural temperature data are captured by an image capture device and a temperature detection device respectively, and the temperature detection device includes a single temperature sensor that is separate from the image capture device.
6. The method of claim 5, wherein the temperature sensor obtains the plural temperature data at specific points of a face according to facial positions detected in the visual image.
7. The method of claim 1, wherein the facial feature database is generated by the following steps:
capturing an image of a person under test, resulting in a captured image;
performing face detection on the captured image;
catching a face image from the captured image according to results of the face detection;
extracting facial features from the face image;
numericalizing the facial features to generate facial feature values; and
building a model according to the facial feature values, thereby generating the facial feature database.
9. The method of claim 8, wherein the predetermined facial temperature distribution comprises one or more of the following requirements:
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
11. The system of claim 10, further comprising a memory unit that stores a computer program, by which the processor performs live facial recognition.
12. The system of claim 10, wherein the predetermined facial temperature distribution comprises one or more of the following requirements:
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.
13. The system of claim 10, wherein the image capture device is separate from the temperature detection device, and the temperature detection device obtains a temperature image representing the temperature information.
14. The system of claim 10, wherein the image capture device and the temperature detection device are integrated, and respectively capture the visual image and a temperature image at a same time, the temperature image representing the temperature information.
15. The system of claim 10, wherein the image capture device is separate from the temperature detection device, and the temperature detection device includes a single temperature sensor.
16. The system of claim 15, wherein the temperature sensor obtains plural temperature data at specific points of a face according to facial positions detected in the visual image, the plural temperature data representing the temperature information.
18. The system of claim 17, wherein the predetermined facial temperature distribution comprises one or more of the following requirements:
facial temperature is higher than ambient temperature;
temperature above eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other area;
below the eyes, nose temperature is lower than temperature of other area;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperature in the face, but higher than the ambient temperature.

This application claims priority of Taiwan Patent Application No. 106135308, filed on Oct. 16, 2017, the entire contents of which are herein expressly incorporated by reference.

1. Field of the Invention

The present invention generally relates to facial recognition, and more particularly to a live facial recognition method and system.

2. Description of Related Art

Facial recognition is computer image processing capable of identifying facial features in a digital image or a video frame, and can be used as a security measure. It is a form of biometrics, like fingerprint or iris recognition, and may be adopted in electronic devices such as computers, mobile phones and card readers. Particularly, as mobile devices become more popular, such security measures are in high demand.

A conventional facial recognition system uses a two-dimensional (2D) camera to capture an image, from which facial features are extracted and compared with a database. However, a conventional facial recognition system usually cannot distinguish a real person from a picture during recognition, leaving a security loophole to be exploited.

In order to enhance the reliability of the security measure, facial recognition systems have been proposed that ask a user to act according to a given instruction, such as swinging or rotating the head, opening the mouth or closing the eyes. Further, images may be captured while the user acts on the instruction, from which depth information may be obtained and used to identify a real person. Nevertheless, those schemes take time and cause inconvenience.

A need has thus arisen for a novel facial recognition scheme that maintains or enhances the reliability of the security measure while accelerating facial recognition and keeping it convenient.

In view of the foregoing, it is an object of the embodiment of the present invention to provide a live facial recognition method and system capable of recognizing a face quickly, accurately and conveniently.

According to one embodiment, a visual image and temperature information of a person under recognition are captured, and facial features are derived according to the visual image. The facial features of the visual image are compared with corresponding facial features of a facial feature database to obtain a difference therebetween. If the difference is less than a predetermined threshold value, it is determined whether the facial temperature conforms to a live facial recognition criterion according to the temperature information.

FIG. 1 shows a block diagram illustrating a live facial recognition system according to one embodiment of the present invention;

FIG. 2A shows an exemplary pixel arrangement of an image sensor in a 2D camera;

FIG. 2B shows an exemplary pixel arrangement of an image sensor in a red-green-blue and infrared (RGB-IR) camera;

FIG. 3 shows a flow diagram illustrating a method of generating a database;

FIG. 4 shows exemplary facial features according to one embodiment of the present invention;

FIG. 5 shows a flow diagram illustrating a live facial recognition method according to a first embodiment of the present invention;

FIG. 6 shows a detailed flow diagram of the live facial recognition criterion of FIG. 5;

FIG. 7 shows another detailed flow diagram of the live facial recognition criterion of FIG. 5;

FIG. 8 shows a flow diagram illustrating a live facial recognition method according to a second embodiment of the present invention; and

FIG. 9 shows a flow diagram illustrating a live facial recognition method according to a third embodiment of the present invention.

FIG. 1 shows a block diagram illustrating a live facial recognition system 100 according to one embodiment of the present invention. The live facial recognition system (“recognition system” hereinafter) 100 of the embodiment may include a processor 11, an image capture device 12, a temperature detection device 13, a memory unit 14 and a storage device 15. Specifically, the processor 11 (e.g., a digital image processor) may include circuitry such as an integrated circuit. The image capture device 12, controlled by the processor 11, may be configured to capture an image of a person under test. The image capture device 12 may be a two-dimensional (2D) camera for capturing a 2D image. Alternatively, the image capture device 12 may be a three-dimensional (3D) camera for obtaining depth information in addition to a 2D image. In one example, the 3D camera may be composed of two lenses with disparity, according to which the depth information may be obtained. In another example, the 3D camera may be composed of a 2D camera and a depth detection device.

In the embodiment, the temperature detection device 13, controlled by the processor 11, may be configured to detect temperature in a specific area in order to obtain temperature information. The temperature detection device 13 may include a temperature sensor array (e.g., an infrared temperature sensor array) that is capable of obtaining a temperature image representing the temperature information of an entire face. The temperature sensor array is also called a temperature camera (e.g., an infrared camera) in this specification. Alternatively, the temperature detection device 13 may include a single temperature sensor (e.g., an infrared temperature sensor or an infrared thermometer) that is capable of detecting plural temperature data representing the temperature information at specific points of a face. The single temperature sensor may adopt remote sensing to detect one point at a time. In one embodiment, the temperature detection device 13 is embedded in the image capture device 12 (i.e., the image capture device 12 and the temperature detection device 13 are integrated), and therefore the temperature information and the visual image can be obtained at the same time. FIG. 2A shows an exemplary pixel arrangement of an image sensor in a 2D camera, illustrating red pixels (R), green pixels (G) and blue pixels (B) arranged in Bayer array. FIG. 2B shows an exemplary pixel arrangement of an image sensor in a red-green-blue and infrared (RGB-IR) camera, illustrating infrared pixels (IR) interposed among red pixels (R), green pixels (G) and blue pixels (B) together arranged in Bayer array.
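For illustration, the following is a minimal sketch of pulling the sparse IR samples out of an RGB-IR mosaic. The 2×2 tile layout assumed here (IR at odd rows, even columns) is hypothetical; real RGB-IR sensors commonly use a 4×4 pattern, and the patent does not specify the layout.

```python
import numpy as np

def extract_ir_channel(raw: np.ndarray) -> np.ndarray:
    """Collect IR sites (assumed at rows 1::2, cols 0::2) and upsample
    back to full resolution with nearest-neighbour replication."""
    ir_sparse = raw[1::2, 0::2]
    return np.repeat(np.repeat(ir_sparse, 2, axis=0), 2, axis=1)

raw = np.random.randint(0, 1024, size=(8, 8))  # fake 10-bit RGB-IR mosaic
ir = extract_ir_channel(raw)
print(ir.shape)  # (8, 8): a dense IR plane aligned with the visual pixels
```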

In the embodiment, the memory unit 14 may be configured to store a computer program and data for executing the computer program, by which the processor 11 may perform live facial recognition. The memory unit 14 may include a random access memory such as a dynamic random access memory (DRAM), a static random access memory (SRAM) or other memory units suitable for storing a computer program. The storage device 15 of the embodiment may be configured to store a database adaptable to live facial recognition. The storage device 15 may include a hard disk, a solid state disk (SSD) or other storage devices suitable for storing a database.

Prior to live facial recognition, a facial feature database should be generated through registration and modeling. Afterwards, live facial recognition may be performed by comparing the facial features of a person under recognition with the facial feature database.

FIG. 3 shows a flow diagram illustrating a method 200 of generating a database. Specifically, in step 21, the image capture device 12 may capture an image of a person under test. Next, in step 22, the processor 11 may perform face detection on the captured image. In step 23, a face image (i.e., an image substantially covering a facial contour) may be caught from the captured image according to results of the face detection. Subsequently, in step 24, the processor 11 may extract or derive facial features from the face image. FIG. 4 shows exemplary facial features according to one embodiment of the present invention, including a distance 41 between the eyes, a width 42 of the nose, depth 43 of an eye socket, a structure 44 of cheekbones, a length 45 of a jaw line, and a chin point 46. As exemplified in FIG. 4, the facial features may be composed of lines, circles, rectangles, triangles, etc. Facial recognition may be performed according to connecting lines, distances and angles between feature points. In another embodiment, facial features may include fewer or more entries.
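As a hedged illustration of how such geometric features might be numericalized (step 24 onward), the sketch below derives a few of the FIG. 4 measurements from landmark points. The landmark names and coordinates are assumptions for illustration; the patent does not prescribe a particular landmark detector.

```python
import numpy as np

# Illustrative landmark positions (pixel coordinates); a real system
# would obtain these from a face landmark detector.
landmarks = {
    "eye_l": np.array([80.0, 120.0]),
    "eye_r": np.array([140.0, 118.0]),
    "nose_l": np.array([100.0, 160.0]),
    "nose_r": np.array([122.0, 160.0]),
    "jaw_top": np.array([60.0, 140.0]),
    "chin": np.array([110.0, 220.0]),
}

def dist(a, b):
    return float(np.linalg.norm(landmarks[a] - landmarks[b]))

features = {
    "eye_distance": dist("eye_l", "eye_r"),      # 41 in FIG. 4
    "nose_width": dist("nose_l", "nose_r"),      # 42 in FIG. 4
    "jaw_line_length": dist("jaw_top", "chin"),  # 45 in FIG. 4
}
print(features)
```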

In step 25, the processor 11 may numericalize the facial features to generate facial feature values. Next, in step 26, a model is built according to the facial feature values, and a facial feature database is accordingly generated and stored in the storage device 15 (step 27).
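A rough sketch of the whole FIG. 3 flow (steps 21-27) follows, with face detection and feature extraction stubbed out as black boxes so the structure of the registry is visible. Nothing below is the patent's actual implementation; the "model" here is simply a per-person mean of the feature values, one plausible choice among many.

```python
import numpy as np

def detect_face(image):            # step 22: placeholder detector
    return (10, 10, 100, 100)      # x, y, w, h of the face

def crop_face(image, box):         # step 23: catch the face image
    x, y, w, h = box
    return image[y:y + h, x:x + w]

def extract_feature_values(face):  # steps 24-25: numericalized features
    return np.array([face.mean(), face.std()])  # toy 2-value vector

def build_database(samples):       # steps 26-27: one model per person
    db = {}
    for person_id, image in samples:
        values = extract_feature_values(crop_face(image, detect_face(image)))
        db.setdefault(person_id, []).append(values)
    return {pid: np.mean(v, axis=0) for pid, v in db.items()}

samples = [("alice", np.random.rand(160, 160)) for _ in range(3)]
database = build_database(samples)  # stored in the storage device 15
```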

FIG. 5 shows a flow diagram illustrating a live facial recognition method 300 according to a first embodiment of the present invention. In step 31, the image capture device 12 (e.g., a color camera) may capture a visual image (e.g., a color image) of a person under recognition, according to which facial feature values may be obtained. The facial feature values may be obtained in a manner similar to steps 22-25 of FIG. 3. Briefly speaking, the processor 11 may perform face detection on a captured visual image, catch a face image, extract plural facial features, and then numericalize the facial features to generate the facial feature values.

In step 32, the temperature detection device 13 (in the embodiment, a temperature camera such as an infrared camera) may capture a temperature image (e.g., an infrared image) of the person under recognition. In the embodiment, the temperature detection device 13 is separate from the image capture device 12, and the temperature image may represent the temperature information. In general, step 32 may be performed at any time prior to step 35. The information obtained by the temperature camera (e.g., the infrared camera) may be transformed into the temperature image. For example, intensity of an infrared detection value may be transformed into a corresponding temperature.
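The text states only that the intensity of an infrared detection value may be transformed into a corresponding temperature; the two-point linear calibration below is an assumption for illustration, whereas real thermal cameras ship vendor-supplied radiometric calibration curves.

```python
import numpy as np

def ir_to_celsius(ir, raw_lo=2000, raw_hi=9000, t_lo=20.0, t_hi=40.0):
    """Map raw IR counts to degrees C with an assumed two-point linear fit."""
    scale = (t_hi - t_lo) / (raw_hi - raw_lo)
    return t_lo + (ir.astype(np.float64) - raw_lo) * scale

ir_frame = np.random.randint(2000, 9000, size=(24, 32))  # fake IR counts
temp_image = ir_to_celsius(ir_frame)  # per-pixel temperature image in C
```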

In step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36).

On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 35.
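A minimal sketch of the comparison in steps 33-34 follows, assuming a Euclidean distance over the feature values and an arbitrary threshold; the patent fixes neither the distance measure nor the threshold value.

```python
import numpy as np

THRESHOLD = 0.5  # assumed acceptable facial-feature error

def match(probe, database):
    """Return the best-matching person id, or None if recognition fails."""
    best_id, best_diff = None, np.inf
    for person_id, model in database.items():
        diff = np.linalg.norm(probe - model)
        if diff < best_diff:
            best_id, best_diff = person_id, diff
    if best_diff < THRESHOLD:
        return best_id          # proceed to the liveness check (step 35)
    return None                 # recognition fails (step 36)

database = {"alice": np.array([0.2, 0.4])}
print(match(np.array([0.25, 0.38]), database))  # "alice"
```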

According to one aspect of the embodiment, in step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37).

FIG. 6 shows a detailed flow diagram of the live facial recognition criterion (step 35) of FIG. 5. In step 351, mapping is performed between the visual image (from the image capture device 12) and the temperature image (from the temperature detection device 13). Specifically, a common reference point of the visual image and the temperature image is first determined, and magnifications of the visual image and the temperature image are then determined according to their capture angles respectively. Accordingly, pixels of the visual image and the temperature image may be correspondingly mapped.
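Under the simplifying assumptions stated above (a common reference point plus per-axis magnifications derived from the capture angles), the pixel mapping of step 351 reduces to a scale and an offset, as sketched below; a fuller calibration (e.g., a homography) could be substituted. All numeric values are illustrative assumptions.

```python
import numpy as np

def map_visual_to_thermal(pt, ref_vis, ref_th, scale):
    """pt, ref_vis, ref_th are (x, y); scale is thermal pixels per visual pixel."""
    return ref_th + (np.asarray(pt, dtype=float) - ref_vis) * scale

ref_vis = np.array([320.0, 240.0])      # reference point in the visual image
ref_th = np.array([16.0, 12.0])         # same point in the temperature image
scale = np.array([32 / 640, 24 / 480])  # assumed relative magnifications

nose_vis = (300, 260)                          # a pixel in the visual image
nose_th = map_visual_to_thermal(nose_vis, ref_vis, ref_th, scale)
print(nose_th)  # corresponding pixel in the temperature image
```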

Subsequently, in step 352, the visual image may be normalized to catch a recognition area. In one example, a recognition area (e.g., a rectangular area) that substantially covers the facial contour is caught in the visual image; for example, a rectangular recognition area is defined by two horizontal lines respectively passing the top and bottom facial contour edges and two vertical lines respectively passing the right and left facial contour edges. In another example, a recognition area is caught by removing pixels with temperature lower than the average temperature in the temperature image. If it is determined that the person under recognition wears a mask (which causes the temperature below the nose to be lower than the temperature at the nose) or glasses (which cause the temperature at the eyes to be lower than the temperature at the nose), the corresponding area may also be removed.
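As a rough sketch of the second example (discarding pixels colder than the frame average), the snippet below crops to the bounding box of the remaining warm region. The synthetic warm-blob test image and the bounding-box crop are assumptions; the text states only the removal rule.

```python
import numpy as np

def catch_recognition_area(temp_image):
    mask = temp_image > temp_image.mean()   # drop colder-than-average pixels
    rows = np.flatnonzero(mask.any(axis=1))
    cols = np.flatnonzero(mask.any(axis=0))
    if rows.size == 0:
        return None                          # no face-like warm region found
    return temp_image[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]

# Synthetic frame: a warm Gaussian blob (a "face") on a cool background.
temp_image = 22.0 + 12.0 * np.exp(
    -(((np.arange(24)[:, None] - 12) ** 2) +
      ((np.arange(32)[None, :] - 16) ** 2)) / 40.0)
area = catch_recognition_area(temp_image)
print(area.shape)  # bounding box of the warm region
```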

In step 353, a corresponding recognition area is caught in the temperature image according to the mapping relationship between the visual image and the temperature image. In step 354, the recognition area of the temperature image is divided into sub-areas (e.g., 5×5 sub-areas). In another example, sub-areas may be obtained according to feature points shown in FIG. 4. For example, a triangular sub-area may be defined by three adjacent feature points, and a rectangular sub-area may be defined by four adjacent feature points.
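A minimal sketch of the grid division in step 354 follows, recording each sub-area's temperature as its mean; the 5×5 grid size comes from the example in the text, while the use of the mean is one of the area-temperature definitions given further below.

```python
import numpy as np

def grid_temperatures(area, n=5):
    """Cut the recognition area into an n x n grid of sub-area mean temperatures."""
    h, w = area.shape
    ys = np.linspace(0, h, n + 1, dtype=int)
    xs = np.linspace(0, w, n + 1, dtype=int)
    return np.array([[area[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
                      for j in range(n)] for i in range(n)])

sub_temps = grid_temperatures(np.random.uniform(28.0, 36.0, size=(50, 45)))
print(sub_temps.shape)  # (5, 5)
```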

In step 355, a feature area (that includes at least one sub-area) is selected in the visual image, and a corresponding feature area is then obtained in the temperature image. For example, a feature area of the temperature image that includes at least one sub-area covering a facial organ (e.g., the nose) is obtained. Subsequently, in step 356, the temperature of the feature area of the temperature image is compared with the temperature of at least one neighboring sub-area. The at least one neighboring sub-area may include a top and a bottom neighboring sub-area; top, bottom, left and right neighboring sub-areas; three top and three bottom neighboring sub-areas; or eight neighboring sub-areas along eight directions. In this specification, the temperature of an area may refer to a mean or median temperature of the area, or to a mean, maximum or minimum value of some intermediate temperatures.
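The snippet below sketches step 356 on a 5×5 sub-area grid, comparing the feature sub-area (here the centre cell, assumed to cover the nose) with its top and bottom neighbours. Which and how many neighbours to use is a design choice the text leaves open.

```python
import numpy as np

sub_temps = np.random.uniform(32.0, 35.0, size=(5, 5))  # grid of mean temps
sub_temps[2, 2] = 30.0                                  # synthetic cool nose cell

def feature_cooler_than_neighbours(grid, row=2, col=2):
    """Is the feature sub-area cooler than its top/bottom neighbours?"""
    feature = grid[row, col]
    neighbours = (grid[row - 1, col], grid[row + 1, col])
    return all(feature < t for t in neighbours)

print(feature_cooler_than_neighbours(sub_temps))  # True for this grid
```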

Finally, in step 357, if the comparing result (of step 356) conforms to a predetermined (or normal) facial temperature distribution, the live facial recognition criterion is asserted. In this specification, the predetermined facial temperature distribution may refer to a mean or median value among plural people, or to a mean value of some intermediate values among plural people. Taking the feature area covering the nose as an example, the normal facial temperature distribution is conformed to if the temperature of the feature area is lower than that of the neighboring sub-areas; on the contrary, it is not conformed to if the temperature of the feature area is higher than that of the neighboring sub-areas. In the embodiment, the predetermined facial temperature distribution (step 357) may include one or more of the following requirements (a minimal check over a few of these rules is sketched after the list):

facial temperature is higher than ambient temperature;
temperature above the eyes is higher than temperature below the eyes;
above the eyes, eyebrow temperature is lower than temperature of other areas;
below the eyes, nose temperature is lower than temperature of other areas;
the facial temperature is higher than 28° C.;
eye temperature is the highest in a face;
nose temperature is the lowest in the face;
temperature above the nose is higher than temperature below the nose; and
temperature of the eyes with glasses is lower than other temperatures in the face, but higher than the ambient temperature.
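Below is a hedged sketch of step 357 expressing a few of the listed requirements as predicates over per-region temperatures. Region extraction is assumed already done; the region keys and sample readings are assumptions, while the 28° C. floor comes from the list above.

```python
def conforms(face, ambient):
    """face: dict of region temperatures in C (key names are assumptions)."""
    checks = [
        face["face"] > ambient,              # face warmer than ambient
        face["face"] > 28.0,                 # absolute floor from the list
        face["above_eyes"] > face["below_eyes"],
        face["nose"] < face["below_eyes"],   # nose coolest below the eyes
    ]
    return all(checks)

sample = {"face": 33.0, "above_eyes": 34.0, "below_eyes": 32.5, "nose": 30.0}
print(conforms(sample, ambient=24.0))  # True for this synthetic reading
```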

FIG. 7 shows another detailed flow diagram of the live facial recognition criterion (step 35) of FIG. 5. In this embodiment, only the temperature image (from the temperature detection device 13) is used, not the visual image (from the image capture device 12); therefore, mapping between the visual image and the temperature image need not be performed. In step 352B, the temperature image may be normalized to catch a recognition area. In one example, a recognition area is caught by removing pixels with temperature lower than the average temperature in the temperature image. If it is determined that the person under recognition wears a mask or glasses, the corresponding area may also be removed.

Subsequently, in step 354, the recognition area is divided into sub-areas (e.g., 5×7 sub-areas). In step 355, a feature area (that includes at least one sub-area) is selected in the temperature image. For example, a sub-area with low temperature in the middle of the temperature image is selected as a feature area covering the nose.

In another example, the recognition area is not divided into rectangular sub-areas in step 354. Instead, the nose may be located in an area with low temperature in the middle of the temperature image, followed by locating left and right eyes in an area with higher temperature above the nose. A feature area (i.e., a triangular area) may be defined by the located nose and eyes (step 355).
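A sketch of this alternative localization follows: the nose is taken as the coldest point in the middle band of the temperature image, and the eyes as the warmest points in the left and right halves of the region above it. The band boundaries are assumptions; the text gives only the qualitative rule.

```python
import numpy as np

def locate_nose_and_eyes(temp):
    h, w = temp.shape
    band = temp[h // 3: 2 * h // 3, w // 4: 3 * w // 4]  # assumed middle band
    ny, nx = np.unravel_index(np.argmin(band), band.shape)
    nose = (ny + h // 3, nx + w // 4)                    # coldest point
    above = temp[: nose[0], :]                           # region above nose
    left, right = above[:, : w // 2], above[:, w // 2:]
    ly, lx = np.unravel_index(np.argmax(left), left.shape)
    ry, rx = np.unravel_index(np.argmax(right), right.shape)
    return nose, (ly, lx), (ry, rx + w // 2)  # triangular feature area vertices

temp = np.random.uniform(28.0, 36.0, size=(24, 32))
nose, eye_l, eye_r = locate_nose_and_eyes(temp)
```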

Subsequently, in step 356, the temperature of the feature area is compared with the temperature of at least one neighboring sub-area. Finally, in step 357, if the comparing result (of step 356) conforms to a predetermined (or normal) facial temperature distribution, the live facial recognition criterion is asserted.

FIG. 8 shows a flow diagram illustrating a live facial recognition method 400 according to a second embodiment of the present invention. The present embodiment is similar to the first embodiment (FIG. 5) but the image capture device 12 and the temperature detection device 13 are integrated, for example, as an RGB-IR camera. Therefore, in step 31B, the visual image and the temperature image (that represents the temperature information) are obtained at the same time. As described above, intensity of an infrared detection value may be transformed into a corresponding temperature.

Subsequently, in step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36). On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 35.

In step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37). For the live facial recognition criterion of step 35, reference may be made to the detailed flow diagram of FIG. 6 or FIG. 7 (but with no need to perform the mapping of step 351); details are omitted for brevity.

FIG. 9 shows a flow diagram illustrating a live facial recognition method 500 according to a third embodiment of the present invention. The present embodiment is similar to the first embodiment (FIG. 5), in which the image capture device 12 is separate from the temperature detection device 13. Moreover, the temperature detection device 13 in this embodiment may include a single temperature sensor (e.g., an infrared temperature sensor or an infrared thermometer).

In step 31, the image capture device 12 (e.g., a color camera) may capture a visual image (e.g., a color image) of a person under recognition, according to which facial feature values may be obtained. Next, in step 33, the facial feature values of the person under recognition are compared with a facial feature database (database for short). If a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is not less than a predetermined threshold value (step 34), indicating that they have substantially different facial features (i.e., a facial feature error is outside an acceptable range), the facial recognition fails (step 36). On the contrary, if a difference between the facial feature values of the person under recognition and corresponding facial feature values of the database is less than the predetermined threshold value (step 34), indicating that they have substantially similar facial features (i.e., a facial feature error is inside an acceptable range), the flow then goes to step 32B.

In step 32B, according to facial positions detected in step 31, the temperature sensor (e.g., an infrared temperature sensor or an infrared thermometer) may adopt remote sensing to obtain plural temperature data representing the temperature information at specific points of a face.
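The sketch below illustrates step 32B: a single remote-sensing sensor samples a few facial points chosen from positions detected in the visual image. read_point_temperature() is a hypothetical stand-in for the sensor's actual API, which the patent does not name, and the point coordinates are assumptions.

```python
def read_point_temperature(x, y):   # hypothetical single-sensor call
    return 30.0                     # placeholder reading in C

# Facial points assumed to come from face detection on the visual image.
facial_points = {"nose": (300, 260), "eye_l": (280, 215), "eye_r": (330, 214)}

temperature_data = {name: read_point_temperature(*pt)
                    for name, pt in facial_points.items()}
# temperature_data now plays the role of the temperature information
# consumed by the FIG. 7 flow (with the temperature image replaced).
```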

In step 35, it is determined whether the facial temperature of the person under recognition conforms to a live facial recognition criterion. If the determination of step 35 is negative, the facial recognition fails (step 36); if positive, the facial recognition succeeds (step 37). For the live facial recognition criterion of step 35, reference may be made to the detailed flow diagram of FIG. 7 (with the temperature image replaced by the temperature data); details are omitted for brevity.

Although specific embodiments have been illustrated and described, it will be appreciated by those skilled in the art that various modifications may be made without departing from the scope of the present invention, which is intended to be limited solely by the appended claims.

Inventors: Chen, Yin-Yu; Chang, Yao-Tsung
