A method, apparatus and computer program product are provided for determining a spatial location for one or more facial features. A method computes features for an initial frame. The computed features of the initial frame generate a feature image. A method also determines whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. A method also includes performing a face search, using a portion of the feature image, for one or more facial features, wherein the portion of the feature image searched is a fraction of the total number of frames analyzed in a feature computation cycle. A method also determines a spatial location for the one or more facial features detected in the intermediate frame.
1. A method comprising:
computing features for an initial frame, wherein the computed features of the initial frame generate a feature image;
determining whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified if a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level, and wherein determining whether the translation is verified comprises computing the translation using correlation and sum of absolute differences of a horizontal integral projection and a vertical integral projection;
performing a face search, using the computed features of the initial frame and a portion of the intermediate frame, for one or more facial features, wherein the portion of the intermediate frame searched is a fraction of the total number of frames analyzed in a feature computation cycle; and
determining a spatial location for the one or more facial features detected in the searched portion of the intermediate frame.
13. A non-transitory computer readable product with a computer program comprising program code which, when executed by an apparatus, causes the apparatus at least to:
compute features for an initial frame, wherein the computed features of the initial frame generate a feature image;
determine whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level, and wherein the apparatus is caused to determine whether the translation is verified by computing the translation using correlation and sum of absolute differences of a horizontal integral projection and a vertical integral projection;
perform a face search, using the computed features of the initial frame and a portion of the intermediate frame, for one or more facial features, wherein the portion of the intermediate frame searched is a fraction of the total number of frames analyzed in a feature computation cycle; and
determine a spatial location for the one or more facial features detected in the searched portion of the intermediate frame.
8. An apparatus comprising a processor and a memory including software, the memory and the software configured to, with the processor, cause the apparatus to at least:
compute features for an initial frame, wherein the computed features of the initial frame generate a feature image;
determine whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified if a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level, and wherein the apparatus is caused to determine whether the translation is verified by computing the translation using correlation and sum of absolute differences of a horizontal integral projection and a vertical integral projection;
in an instance in which the determined translation is verified, perform a face search using the computed features of the initial frame and a portion of the intermediate frame for one or more facial features, wherein the portion of the intermediate frame searched is a fraction of the total number of frames analyzed in a feature computation cycle; and
determine a spatial location for the one or more facial features detected in the searched portion of the intermediate frame.
2. The method according to claim 1, wherein computing the translation further comprises:
determining the horizontal integral projection based on the horizontal gradients of the luminance plane for the initial frame and the intermediate frame;
determining the vertical integral projection based on the vertical gradients of the luminance plane for the initial frame and the intermediate frame;
determining the horizontal translation of the intermediate frame in a horizontal direction using the vertical integral projection; and
determining the vertical translation of the intermediate frame using the horizontal integral projection.
3. The method according to claim 1, further comprising:
searching a portion of the feature image at a final frame for the one or more facial features;
determining a spatial location for the one or more facial features located in the searched portion of the final frame; and
providing a next frame as the initial frame, wherein the next frame restarts the feature computation cycle.
4. The method according to
5. The method according to
6. The method according to
7. The method according to claim 1, wherein performing the face search further comprises:
searching a subwindow in the feature image at a first intermediate frame starting at a first pixel location; and
searching a subwindow in the feature image at a second intermediate frame starting at a second pixel location, wherein the second pixel location is offset from the first pixel location using the determined translation.
9. The apparatus according to claim 8, wherein the apparatus is further caused to:
determine the horizontal integral projection based on the horizontal gradients of the luminance plane for the initial frame and the intermediate frame;
determine the vertical integral projection based on the vertical gradients of the luminance plane for the initial frame and the intermediate frame;
determine the horizontal translation of the intermediate frame in a horizontal direction using the vertical integral projection; and
determine the vertical translation of the intermediate frame using the horizontal integral projection.
10. The apparatus according to
11. The apparatus according to
12. The apparatus according to claim 8, wherein the apparatus is further caused to:
search a subwindow in the feature image at a first intermediate frame starting at a first pixel location; and
search a subwindow in the feature image at a second intermediate frame starting at a second pixel location, wherein the second pixel location is offset from the first pixel location using the determined translation.
14. The non-transitory computer readable product with the computer program according to claim 13, wherein the apparatus is further caused to:
determine the horizontal integral projection for the initial frame and the intermediate frame based on the horizontal gradients of the luminance plane;
determine the vertical integral projection for the initial frame and the intermediate frame based on the vertical gradients of the luminance plane;
determine the horizontal translation of the intermediate frame in a horizontal direction using the vertical integral projection; and
determine the vertical translation of the intermediate frame using the horizontal integral projection.
15. The non-transitory computer readable product with the computer program according to
16. The non-transitory computer readable product with the computer program according to
17. The non-transitory computer readable product with the computer program according to claim 13, wherein the apparatus is further caused to:
search a subwindow in the feature image at a first intermediate frame starting at a first pixel location; and
search a subwindow in the feature image at a second intermediate frame starting at a second pixel location, wherein the second pixel location is offset from the first pixel location using the determined translation.
This application was originally filed as Patent Cooperation Treaty Application No. PCT/FI2012/050760, filed on Aug. 2, 2012, which claims priority benefit to Indian Patent Application No. 2947/CHE/2011, filed Aug. 29, 2011.
Example embodiments relate generally to feature computation and, more particularly, to feature computation utilizing temporal redundancy between video frames to reduce computational intensity.
In various image processing applications, it may be desirable to track an object, such as a feature (e.g., a face, facial feature, etc.), between successive frames in a video. In order to track a feature from one frame to the next, each frame may be analyzed to determine the new location of the feature. However, analyzing each frame may be a computationally intensive process that is challenging to perform in an efficient and timely manner, at least for devices with limited computational resources.
Feature tracking may be computationally intensive for various reasons. For example, some feature tracking techniques analyze each entire frame or at least a relatively large portion of each frame. As such, it would be desirable to provide an improved technique for tracking features between frames, such as the frames of a video, that provides accurate results with reduced computational requirements.
A method, apparatus and computer program product are provided in accordance with an example embodiment in order to provide an improved technique for feature computation and facial searching. In this regard, the method, apparatus and computer program product of an example embodiment may provide for feature computation in a manner that reduces computational requirements while continuing to provide reliable and robust feature computation. Indeed, the method, apparatus and computer program product of one example embodiment may provide for computing features in an image frame by exploiting the amount of overlap across a plurality of image frames.
In an embodiment, a method computes features for an initial frame. The computed features of the initial frame generate a feature image. A method also determines whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. A method also includes performing a face search, using a portion of the feature image, for one or more facial features, wherein the portion of the feature image searched is a fraction of the total number of frames analyzed in a feature computation cycle. A method also determines a spatial location for the one or more facial features detected in the intermediate frame.
In an embodiment, a method further determines the translation by determining a horizontal integral projection based on the horizontal gradients of the luminance plane for the initial frame and the intermediate frame. A method also determines a vertical integral projection based on the vertical gradients of the luminance plane for the initial frame and the intermediate frame. A method also determines the horizontal translation of the intermediate frame in a horizontal direction using the vertical integral projection. A method also determines the vertical translation of the intermediate frame using the horizontal integral projection.
In an embodiment, a method further continues through a feature computation cycle by searching a portion of the feature image at a final frame for the one or more facial features. A method also determines a spatial location for the one or more facial features located in the searched portion of the final frame. A method also identifies a next frame as the initial frame, wherein the next frame restarts the feature computation cycle.
In another embodiment, an apparatus is provided comprising a processor and a memory including software, the memory and the software configured to, with the processor, cause the apparatus at least to compute features for an initial frame. The computed features of the initial frame generate a feature image. The apparatus is further caused to determine whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. The apparatus is further caused to perform a face search using a portion of the feature image for one or more facial features, wherein the portion of the feature image searched is a fraction of the total number of frames analyzed in a feature computation cycle. The apparatus is further caused to determine a spatial location for the one or more facial features detected in the intermediate frame.
In a further embodiment, a computer program product is provided comprising at least one computer readable non-transitory memory having program code stored thereon which, when executed by an apparatus, causes the apparatus at least to compute features for an initial frame. The computed features of the initial frame generate a feature image. A computer program product is further configured to determine whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. A computer program product is further configured to perform a face search using a portion of the feature image for one or more facial features, wherein the portion of the feature image searched is a fraction of the total number of frames analyzed in a feature computation cycle. A computer program product is further configured to determine a spatial location for the one or more facial features detected in the intermediate frame.
In yet another embodiment, an apparatus is provided that includes means for computing features for an initial frame. The computed features of the initial frame generate a feature image. An apparatus further comprises means for determining whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. An apparatus further comprises means for performing a face search, using a portion of the feature image, for one or more facial features, wherein the portion of the feature image searched is a fraction of the total number of frames analyzed in a feature computation cycle. An apparatus further comprises means for determining a spatial location for the one or more facial features detected in the intermediate frame.
Having thus described certain example embodiments in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale.
Some example embodiments will now be described more fully hereinafter with reference to the accompanying drawings. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
As used herein, the terms “data,” “content,” “information” and similar terms may be used interchangeably to refer to data capable of being transmitted, received, displayed and/or stored in accordance with various example embodiments. Thus, use of any such terms should not be taken to limit the spirit and scope of the disclosure.
The term “computer-readable medium” as used herein refers to any medium configured to participate in providing information to a processor, including instructions for execution. Such a medium may take many forms, including, but not limited to, a non-transitory computer-readable storage medium (for example, non-volatile media, volatile media), and transmission media. Transmission media include, for example, coaxial cables, copper wire, fiber optic cables, and carrier waves that travel through space without wires or cables, such as acoustic waves and electromagnetic waves, including radio, optical and infrared waves. Examples of non-transitory computer-readable media include a floppy disk, hard disk, magnetic tape, any other non-transitory magnetic medium, a compact disc read only memory (CD-ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-Ray, any other non-transitory optical medium, a random access memory (RAM), a programmable read only memory (PROM), an erasable programmable read only memory (EPROM), a FLASH-EPROM, any other memory chip or cartridge, or any other non-transitory medium from which a computer can read. The term computer-readable storage medium is used herein to refer to any computer-readable medium except transmission media. However, it will be appreciated that where embodiments are described to use a computer-readable storage medium, other types of computer-readable media may be substituted for or used in addition to the computer-readable storage medium in alternative embodiments.
Additionally, as used herein, the term ‘circuitry’ refers to (a) hardware-only circuit implementations (for example, implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
An apparatus 10 for performing feature computation in accordance with one example embodiment of the present invention is shown in
Referring now to
The apparatus 10 may, in some embodiments, be a mobile terminal or other computing device configured to employ an example embodiment of the present invention. However, in some embodiments, the apparatus may be embodied as a chip or chip set. In other words, the apparatus may comprise one or more physical packages (e.g., chips) including materials, components and/or wires on a structural assembly (e.g., a baseboard). The structural assembly may provide physical strength, conservation of size, and/or limitation of electrical interaction for component circuitry included thereon. The apparatus may therefore, in some cases, be configured to implement an embodiment of the present invention on a single chip or as a single “system on a chip.” As such, in some cases, a chip or chipset may constitute means for performing one or more operations for providing the functionalities described herein.
The processor 12 may be embodied in a number of different ways. For example, the processor may be embodied as one or more of various hardware processing means such as a coprocessor, a microprocessor, a controller, a digital signal processor (DSP), a processing element with or without an accompanying DSP, or various other processing circuitry including integrated circuits such as, for example, an ASIC (application specific integrated circuit), an FPGA (field programmable gate array), a microcontroller unit (MCU), a hardware accelerator, a special-purpose computer chip, or the like. As such, in some embodiments, the processor may include one or more processing cores configured to perform independently. A multi-core processor may enable multiprocessing within a single physical package. Additionally or alternatively, the processor may include one or more processors configured in tandem via the bus to enable independent execution of instructions, pipelining and/or multithreading.
In an example embodiment, the processor 12 may be configured to execute instructions stored in the memory device 14 or otherwise accessible to the processor. Alternatively or additionally, the processor may be configured to execute hard coded functionality. As such, whether configured by hardware or software methods, or by a combination thereof, the processor may represent an entity (e.g., physically embodied in circuitry) capable of performing operations according to an embodiment of the present invention while configured accordingly. Thus, for example, when the processor is embodied as an ASIC, FPGA or the like, the processor may be specifically configured hardware for conducting the operations described herein. Alternatively, as another example, when the processor is embodied as an executor of software instructions, the instructions may specifically configure the processor 12 to perform the algorithms and/or operations described herein when the instructions are executed. However, in some cases, the processor may be a processor of a specific device (e.g., a mobile terminal or other computing device), such as a processor of a mobile terminal, adapted for employing an embodiment of the present invention by further configuration of the processor by instructions for performing the algorithms and/or operations described herein. The processor may include, among other things, a clock, an arithmetic logic unit (ALU) and logic gates configured to support operation of the processor.
Meanwhile, the communication interface 16 may be any means such as a device or circuitry embodied in either hardware or a combination of hardware and software that is configured to receive and/or transmit data from/to a network and/or any other device or module in communication with the apparatus 10. In this regard, the communication interface may include, for example, an antenna (or multiple antennas) and supporting hardware and/or software for enabling communications with a wireless communication network. Additionally or alternatively, the communication interface may include the circuitry for interacting with the antenna(s) to cause transmission of signals via the antenna(s) or to handle receipt of signals received via the antenna(s). In some environments, the communication interface may alternatively or also support wired communication. As such, for example, the communication interface may include a communication modem and/or other hardware/software for supporting communication via cable, digital subscriber line (DSL), universal serial bus (USB) or other mechanisms.
The user interface 18 may be in communication with the processor 12 to receive an indication of a user input at the user interface and/or to cause provision of an audible, visual, mechanical or other output to the user. As such, the user interface 18 may include, for example, a keyboard, a mouse, a joystick, a display, a touch screen(s), touch areas, soft keys, a microphone, a speaker, or other input/output mechanisms. Alternatively or additionally, the processor may comprise user interface circuitry configured to control at least some functions of one or more elements of the user interface, such as, for example, a speaker, ringer, microphone, display, and/or the like. The processor and/or user interface circuitry comprising the processor may be configured to control one or more functions of one or more elements of the user interface through computer program instructions (e.g., software and/or firmware) stored on a memory accessible to the processor (e.g., memory device 14, and/or the like).
With reference to
At operation 20, the apparatus 10 may include means, such as the processor 12 or the like, for computing features for an initial frame. As described herein, the computed features for the initial frame may also be referred to as the feature image. An example feature detection method is configured to be trained on 20×20 input samples with local binary pattern (“LBP”) based values of the pixels as its features. The LBP values may be computed on a 20 pixel×20 pixel input sample to provide an 18×18 LBP sample image. Alternatively or additionally, other image sizes may be used with embodiments of the current invention. For each input sample of size 20×20, an LBP image is computed and a histogram of LBP values is obtained at each of the 18×18 possible coordinate locations in the LBP image, and a coordinate may be selected as a weak classifier. The object/face detection method is configured to take an LBP window as input and classify it. To detect an object/face in an image, the LBP values are computed for the whole image and each window is given to the LBP based classifier to decide whether a feature is present. For example, to perform the initial face detection process in video frames, LBP values are computed for every frame and every subwindow is scanned in raster order over all positions and scales.
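As a concrete illustration of the LBP computation described above, the following Python sketch computes a basic 8-neighbour LBP image, so that a 20×20 input sample yields an 18×18 LBP image. This is a minimal sketch under stated assumptions, not the patented implementation: the neighbour ordering, the use of NumPy, and the function name are illustrative.

```python
import numpy as np

def lbp_image(gray):
    """Basic 8-neighbour LBP: each interior pixel is encoded by comparing
    its 8 neighbours against it, so a W x H input yields a (W-2) x (H-2)
    LBP image (20x20 -> 18x18, as described above)."""
    g = gray.astype(np.int32)
    center = g[1:-1, 1:-1]
    out = np.zeros_like(center)
    # Neighbour offsets paired with bit positions (ordering is illustrative).
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                  (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(neighbours):
        shifted = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        out |= (shifted >= center).astype(np.int32) << bit
    return out.astype(np.uint8)

# A 20x20 sample produces the 18x18 LBP sample image:
sample = np.random.randint(0, 256, (20, 20), dtype=np.uint8)
assert lbp_image(sample).shape == (18, 18)
```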
Using a feature detection method, such as the feature detection method described above, features are computed for the initial frame with respect to operation 20. The translation between a current frame and a next frame, such as, for example, between the initial frame and a subsequent, intermediate frame, may then be determined. The initial, intermediate, and/or final frames as used herein may relate to a frame instant and/or the like. In this regard, the apparatus 10 may include means, such as the processor 12 or the like, for determining whether a translation is verified between the initial frame and an intermediate frame, wherein a translation is verified in an instance in which a distance used to verify the translation between the initial frame and the intermediate frame is within a predetermined threshold level. See operation 21 of
The determination of the translational motion between two frames, such as a current frame and a next frame, and in particular between an initial frame and an intermediate frame, is shown with respect to
As is shown in operations 34 and 36, the apparatus 10 may also include means, such as the processor 12 or the like, for determining the horizontal translation (Δx) of the intermediate frame in a horizontal direction using a correlation and sum of absolute differences (“SAD”) measure between the vertical integral projections v1(x) and v2(x); and for determining the vertical translation (Δy) of the intermediate frame using the correlation and SAD between the horizontal integral projections h1(x) and h2(x). For example, the horizontal and vertical translations may be determined as follows:
Δx = arg min_x { Σ_{i=1..W} |v1(i) − v2(x+i)| }, where −Woff <= x <= Woff
Δy = arg min_x { Σ_{i=1..H} |h1(i) − h2(x+i)| }, where −Hoff <= x <= Hoff
where W and H are the width and height, respectively, of the video frame, and Woff = W/20 and Hoff = H/20.
As is shown in operation 38, the apparatus 10 may verify the computed translation by determining whether the distance measure used in the translation computation is within the predetermined threshold level. In an instance in which the translation is not verified successfully, a new initial image frame is analyzed and the feature computation cycle restarts at operation 20.
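The projection-based translation search and its verification can be sketched as follows. This is a hedged illustration rather than the claimed implementation: it uses a SAD-only search (the claims also mention correlation), the projections are taken over simple luminance gradients, and the threshold value and function names are assumptions of the example.

```python
import numpy as np

def integral_projections(gray):
    """Integral projections of the luminance-gradient magnitudes:
    h(y) from row-wise (horizontal) gradients, v(x) from column-wise ones."""
    g = gray.astype(np.float64)
    h = np.abs(np.diff(g, axis=0)).sum(axis=1)  # horizontal projection h(y)
    v = np.abs(np.diff(g, axis=1)).sum(axis=0)  # vertical projection v(x)
    return h, v

def sad_shift(p1, p2, max_off):
    """Shift minimising the mean SAD between two 1-D projections;
    returns (best_shift, min_sad) over shifts in [-max_off, max_off]."""
    best, best_sad = 0, np.inf
    for s in range(-max_off, max_off + 1):
        lo, hi = max(0, s), min(len(p1), len(p2) + s)
        sad = np.abs(p1[lo:hi] - p2[lo - s:hi - s]).sum() / (hi - lo)
        if sad < best_sad:
            best, best_sad = s, sad
    return best, best_sad

def estimate_translation(f1, f2, threshold=4.0):  # threshold is illustrative
    """Estimate (dx, dy) between two frames and verify the result against
    a predetermined distance threshold, per operations 21 and 38."""
    H, W = f1.shape
    h1, v1 = integral_projections(f1)
    h2, v2 = integral_projections(f2)
    dx, sad_x = sad_shift(v1, v2, W // 20)  # horizontal shift from vertical proj.
    dy, sad_y = sad_shift(h1, h2, H // 20)  # vertical shift from horizontal proj.
    verified = max(sad_x, sad_y) <= threshold
    return (dx, dy), verified
```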
Provided that the calculated translation between frames is verified successfully, as was described with reference to
In an example embodiment, features are computed for an initial frame and a partial facial search is performed for each subsequent frame within the feature computation cycle. The computed features from the initial frame are used for face detection and are distributed across the subsequent frames. For example, by reusing the features computed with respect to the initial frame, a partial search may be used across a total of n frames. Thus, the complete feature computation and complete face search over all scales, shifts and poses may be performed over n frames.
Alternatively or additionally, in each frame the entire area of the feature image may be searched using all scales with a coarse step size. In this embodiment, the starting point of the search grid varies such that all of the pixel positions are searched within n frames. In one frame, a subwindow is scanned with shifts Δx = n and Δy = n in the x and y directions, and the scan originates from pixel position (1, 1). In the next frame, a subwindow is scanned with Δx = n and Δy = n, but the scan originates from pixel position (2, 2). Continuing this for n frames, most of the positions in the feature image are covered during the feature search (a minimal sketch of this staggered scan follows below). The number of scales to be searched can likewise be distributed across the n frames. For example, and as shown with respect to
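The following sketch illustrates the staggered search grid just described: the scan step equals the cycle length n, and the grid origin advances by one pixel each frame so that n consecutive frames jointly cover all positions. The `detector` callback stands in for the LBP classifier and is an assumption of the example, as are the function and parameter names.

```python
def partial_search(feature_img, frame_idx, n, window=20, detector=None):
    """Scan window x window subwindows of the feature image with step n,
    staggering the grid origin by the frame index so that n consecutive
    frames together cover every pixel position."""
    # Grid origin advances one pixel per frame (0-based here; the text's
    # (1, 1), (2, 2), ... numbering counts from 1).
    origin = frame_idx % n
    H, W = feature_img.shape
    hits = []
    for y in range(origin, H - window + 1, n):
        for x in range(origin, W - window + 1, n):
            if detector(feature_img[y:y + window, x:x + window]):
                hits.append((x, y))  # spatial location of a detected feature
    return hits
```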
Using the search of operation 22, the apparatus 10 may also include means, such as the processor 12 or the like, for determining a spatial location for the one or more facial features detected in the feature image, as is shown in operation 23 of
The method, apparatus and computer program product of the example embodiment of the present invention as described above in conjunction with
In an example implementation, provided for purposes of illustration but not of limitation, to detect a feature of size 80×80 (4 times the base size of 20×20) in an image of size W×H, the image is resized by decimating it to a (W/4×H/4) size. In the resized image, every subwindow of size 20×20 may be selected, with shifts tx and ty in the x and y directions, and fed to a face detector, such as the face detector described herein. A total of 10 face sizes, starting from 200×200 and going down to 20×20, are searched in every frame; in this example the number of scales is M = 10. In one embodiment, starting with the largest scale, 200×200, the image is resized accordingly and an LBP image is computed. Using this LBP image, a face search may be performed. After every face search in one scale, the scale is reduced by a factor, such as a factor of 1.25, and the face search is repeated until the minimum scale size, such as a scale size of 20×20, is reached. In an embodiment, this type of face detection will be able to detect faces ranging in size from 20×20 to 200×200 in a 320×240 image.
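A brief sketch of this multi-scale loop follows. It assumes the decimation-style resizing described above; the exact scale list, the rounding, and the helper names are illustrative (with a 1.25 factor the loop yields roughly, not exactly, 10 scales).

```python
def scales_for_search(max_size=200, min_size=20, factor=1.25):
    """Face sizes to search, from max_size down to min_size."""
    sizes, s = [], float(max_size)
    while s >= min_size:
        sizes.append(int(round(s)))
        s /= factor
    return sizes

def decimate(gray, t):
    """Resize by skipping pixels, as described above: keep every t-th
    pixel, so a W x H image becomes roughly (W/t) x (H/t)."""
    step = max(int(round(t)), 1)
    return gray[::step, ::step]

# For each scale S, decimate by t = S / 20 and scan with the 20x20 base
# detector (reusing the illustrative helpers defined earlier):
# for S in scales_for_search():
#     small = decimate(frame, S / 20)
#     hits = partial_search(lbp_image(small), frame_idx, n, detector=classifier)
```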
In general, and according to an example embodiment for feature detection, to detect a feature of size S = t*20, where 20 <= S < min(W, H), the input image may be resized by skipping pixels to a size of (W/t, H/t), and every subwindow of size 20×20, with shifts in the x and y directions, is given as input to the feature detection apparatus and method described herein. Thus, each subwindow is searched in a first intermediate frame starting at a first pixel location and in a second intermediate frame starting at a second pixel location. As described herein, the second pixel location is offset from the first pixel location using the determined translation.
At decision block 24, the apparatus 10 may also include means, such as the processor 12 or the like, for determining whether the translation was verified with respect to operation 21. If not, as described herein, the feature computation cycle restarts with computing the features for an initial frame, as shown with respect to operation 20. If the translation is verified, then at decision block 25 the apparatus 10 may also include means, such as the processor 12 or the like, for determining whether a next frame, such as a next intermediate frame, is within the n frames of the feature computation cycle, as described herein. If the next frame is within the n frames of the feature computation cycle, the next frame becomes the current frame and the method continues at operation 21. If the next frame is not within the n frames of the feature computation cycle, the method restarts with computing the features for an initial frame, as shown with respect to operation 20.
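Putting operations 20 through 25 together, a minimal driver for the feature computation cycle might look like the following. It is a sketch under the assumptions of the earlier snippets (`lbp_image`, `estimate_translation`, `partial_search`, and the `detector` callback are the illustrative helpers defined above), not the claimed implementation.

```python
def track_features(frames, n, detector):
    """Feature computation cycle: full feature computation on an initial
    frame (operation 20), then verified-translation partial searches on up
    to n - 1 subsequent frames (operations 21-23) before restarting."""
    locations, i = [], 0
    while i < len(frames):
        feature_img = lbp_image(frames[i])      # operation 20
        k = 1
        while k < n and i + k < len(frames):
            shift, verified = estimate_translation(frames[i], frames[i + k])
            if not verified:                    # decision block 24: restart
                break
            hits = partial_search(feature_img, k, n, detector=detector)
            dx, dy = shift
            # Operation 23: spatial locations, compensated by the translation.
            locations += [(x + dx, y + dy, i + k) for (x, y) in hits]
            k += 1                              # decision block 25
        i += k                                  # next frame becomes the initial frame
    return locations
```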
As described above,
Accordingly, blocks of the flowcharts of
As described herein, a method, apparatus and computer program product are provided in accordance with an example embodiment of the present invention in order to provide an improved technique for feature computation. In this regard, the method, apparatus and computer program product of one embodiment may provide for feature computation in a manner that reduces computational requirements while continuing to provide reliable and robust feature computation. Indeed, the method, apparatus and computer program product of one example embodiment may provide for computing features in an image frame by exploiting the amount of overlap across a plurality of image frames.
Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe some example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.