The disclosure pertains to aligning an electronic display or light reflected from the electronic display relative to a position of a user (i.e., viewer), and thus enhancing an overall perceived brightness, contrast ratio, and viewing angle performance of an electronic display (e.g., a specular reflective display) irrespective of illumination conditions (e.g., sunlight, airplane lighting, and overhead lamps). In some embodiments, an electronic device may determine user position data for a position of a user with respect to the electronic display. For example, the electronic device may capture images of the user using one or more image sensors. The user position data may then be processed by the electronic device to generate signals which may modify display characteristics of the electronic display to align an electronic display or light reflected from the electronic display relative to the position of the user.
1. A method, comprising:
determining, based at least in part on data from an image sensor, position data representing an area where at least a portion of a head of a user is relative to an electronic display;
transmitting, based at least in part on the position data, a first signal to a first mechanism;
transmitting, based at least in part on the position data, a second signal to a second mechanism;
causing, via the first mechanism and using the first signal, movement of at least a portion of the electronic display; and
causing, via the second mechanism and using the second signal, movement of a light guide disposed in front of the electronic display by adjusting an angle between the light guide and the electronic display,
wherein the light guide is disposed between a front light source and the electronic display,
wherein the movement of the light guide causes incoming light originating from the front light source to pass through the light guide and to exit the light guide at a first angle as first redirected light towards the electronic display, and
wherein the movement of at least the portion of the electronic display causes the first redirected light to reflect from the electronic display at a second angle as second redirected light to intersect the area where at least the portion of the head of the user is relative to the electronic display.
11. An electronic display assembly comprising:
an electronic display;
a light guide disposed in front of the electronic display;
a first mechanism coupled to at least a portion of the electronic display to cause movement of at least the portion of the electronic display;
a second mechanism coupled to the light guide to cause movement of the light guide by adjusting an angle between the light guide and the electronic display;
at least one light source disposed in front of the light guide to direct light through the light guide and towards the electronic display; and
a controller to:
receive position data representing an area where at least a portion of a head of a user is relative to the electronic display;
transmit, based at least in part on the position data, a first signal to the first mechanism; and
transmit, based at least in part on the position data, a second signal to the second mechanism;
the first mechanism configured to use the first signal to cause the movement of at least the portion of the electronic display,
the second mechanism configured to use the second signal to cause the movement of the light guide by adjusting the angle between the light guide and the electronic display,
wherein the movement of the light guide causes, during operation of the electronic display assembly, incoming light originating from the at least one light source to pass through the light guide and to exit the light guide at a first angle as first redirected light towards the electronic display, and
wherein the movement of at least the portion of the electronic display causes, during the operation of the electronic display assembly, the first redirected light to reflect from the electronic display at a second angle as second redirected light to intersect the area associated with the position data.
18. An electronic device comprising:
an electronic display;
a light guide disposed in front of the electronic display;
a first mechanism coupled to at least a portion of the electronic display to cause movement of at least the portion of the electronic display;
a second mechanism coupled to the light guide to cause movement of the light guide by adjusting an angle between the light guide and the electronic display;
at least one light source disposed in front of the light guide to direct light through the light guide and towards the electronic display;
memory; and
a processor coupled to the memory, the processor configured to execute instructions stored in the memory to cause the electronic device to:
receive position data representing an area where at least a portion of a head of a user is relative to the electronic display;
transmit, based at least in part on the position data, a first signal to the first mechanism; and
transmit, based at least in part on the position data, a second signal to the second mechanism;
the first mechanism configured to use the first signal to cause the movement of at least the portion of the electronic display,
the second mechanism configured to use the second signal to cause the movement of the light guide by adjusting the angle between the light guide and the electronic display,
wherein the movement of the light guide causes, during operation of the electronic device, incoming light originating from the at least one light source to pass through the light guide and to exit the light guide at a first angle as first redirected light towards the electronic display, and
wherein the movement of at least the portion of the electronic display causes, during the operation of the electronic device, the first redirected light to reflect from the electronic display at a second angle as second redirected light to intersect the area associated with the position data.
2. The method as recited in
3. The method as recited in
4. The method as recited in
5. The method as recited in
6. The method as recited in
7. The method as recited in
8. The method as recited in
9. The method as recited in
10. The method as recited in
12. The electronic display assembly as recited in
13. The electronic display assembly as recited in
14. The electronic display assembly as recited in
15. The electronic display assembly as recited in
16. The electronic display assembly as recited in
17. The electronic display assembly as recited in
19. The electronic device as recited in
20. The electronic device as recited in
Display technology continues to evolve to enable creation of displays that provide more vivid imagery, consume less power, are cheaper to manufacture, include few or no toxic materials, have smaller form factors, and so forth. In particular, performance of displays is often measured by contrast ratios, brightness metrics, resolution, and viewing angles. One driving force behind this evolution is the popularity of displays, which are now included in almost all electronic devices. In addition, the size of displays has continued to grow for many product lines, such as televisions, which are now offered in larger sizes that were not available just a few years ago.
Specular reflective displays, which include transflective type displays, provide some improvements over existing display technologies, such as an ability to display imagery using less power consumption than other display types. However, specular reflective displays exhibit relatively lower brightness metrics and contrast ratios compared to conventional emissive displays. These shortcomings are inherent in specular reflective displays because these displays depend on the viewer's position relative to the display and, in some cases, on external lighting conditions.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same reference numbers in different figures indicate similar or identical items.
This disclosure is directed to improving a user experience while viewing an electronic display, such as a specular reflective display. In general, the disclosure pertains to aligning the display or light reflected from the display relative to a position of a user (i.e., viewer), and thus enhancing an overall perceived brightness, contrast ratio, and viewing angle performance of the specular reflective display irrespective of illumination conditions (e.g., sunlight, airplane lighting, and overhead lamps).
In some embodiments, an electronic device may determine user position data for a position of a user with respect to a specular reflective display. For example, the electronic device may capture images of the user using one or more image sensors (e.g., cameras). The user position data may then be processed by the electronic device to generate signals which may modify display characteristics of the specular reflective display. The user position data may be used to track a head of the user, a gaze of the user, eyes of the user, and/or other aspects of the user while interacting with or otherwise viewing the specular reflective display.
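By way of illustration, the sketch below shows one plausible way such user position data might be derived from captured imagery. It assumes an OpenCV Haar-cascade face detector and a simple pinhole-camera mapping from pixel offsets to angles; neither choice is dictated by this disclosure, and the field-of-view value and function name are placeholders.

```python
import cv2  # OpenCV; one of many possible detection back ends


def estimate_user_position(frame, horizontal_fov_deg=60.0):
    """Return an approximate (azimuth, elevation), in degrees, of the user's
    face relative to the camera's optical axis, or None if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None

    # Treat the largest detection as the viewer and take its center point.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    cx, cy = x + w / 2.0, y + h / 2.0

    frame_h, frame_w = gray.shape
    # Map pixel offsets from the image center to angles using the camera's
    # field of view (a small-angle, pinhole-camera approximation).
    vertical_fov_deg = horizontal_fov_deg * frame_h / frame_w
    azimuth = (cx - frame_w / 2.0) / frame_w * horizontal_fov_deg
    elevation = (cy - frame_h / 2.0) / frame_h * vertical_fov_deg
    return azimuth, elevation
```

The resulting angle pair is one possible form of the position data that downstream logic can translate into display adjustments.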
In some embodiments, the electronic device and/or the specular reflective display may process the signals to determine an adjustment to an internal light source, such as a light emitting diode (LED) to change an angle or direction of light reflected into or from the specular reflective display. For example, the specular reflective display may include an array of lights which may be selectively turned on/off depending on a detected position of the user. The selected light(s) may cause emission of light in alignment or near alignment with the gaze of the user (i.e., line of sight of the user while viewing the display). Stated another way, the selection of the light(s) may cause emitted light to exit the specular reflective display in a direction to intersect the position where a face of the user is relative to the electronic device. In this example, the light source may be included in the specular reflective display, in the electronic device, and/or may be external to the electronic device.
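A toy sketch of that selection logic follows. It assumes each light in the array has a known nominal exit angle (the values below are made up) and simply enables the light whose angle best matches the detected viewer direction, plus any others within a tolerance.

```python
def select_lights(viewer_azimuth_deg, light_exit_angles_deg, tolerance_deg=10.0):
    """Return indices of lights to activate: the light whose nominal exit angle
    is closest to the viewer direction, plus any others within the tolerance."""
    best = min(range(len(light_exit_angles_deg)),
               key=lambda i: abs(light_exit_angles_deg[i] - viewer_azimuth_deg))
    return sorted({best} | {i for i, angle in enumerate(light_exit_angles_deg)
                            if abs(angle - viewer_azimuth_deg) <= tolerance_deg})


# Example: five lights spanning -30 to +30 degrees; a viewer detected roughly
# 12 degrees off-axis is served by the +15 degree light.
print(select_lights(12.0, [-30.0, -15.0, 0.0, 15.0, 30.0]))  # -> [3]
```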
In various embodiments, the electronic device and/or the specular reflective display may process the signals to determine an adjustment to redirect a direction of light from external sources (e.g., ambient light) and/or internal sources (e.g., LEDs) towards the specular reflective display by use of films, lenses, light guides, light piping, and/or other static or movable structures. The redirected light may then be in alignment or near alignment with the gaze of the user.
In accordance with one or more embodiments, the electronic device and/or the specular reflective display may process the signals to determine an adjustment to rotate or otherwise move at least a portion of the specular reflective display (e.g., the entire display, sections of the display, pixels of the display) using electrical and/or mechanical devices (e.g., micro-electromechanical systems (MEMS)). These techniques may also be used to move a back reflector. The redirected light may then be in alignment or near alignment with the gaze of the user.
In some embodiments, the electronic device and/or the specular reflective display may process the signals to determine an adjustment to manipulate refractive index changing materials at a bottom (i.e., back layer) of a display stack for transflective displays. The redirected light may then be in alignment or near alignment with the gaze of the user.
In various embodiments, the relative position and/or orientation of a viewer of an electronic device can be determined using at least one image capture element of the device. For example, the feed from a video camera can be analyzed to locate a relative position of the viewer in the video feed, which can be analyzed to determine the relative direction of the viewer (e.g., what the user is looking at, etc.). In some embodiments, one or more digital still cameras can capture images periodically, in response to detected movement of the viewer and/or device, or at other appropriate times, which then can be analyzed to attempt to determine viewer position, as distance can often be determined in addition to direction when analyzing multiple sources of information from different locations. In some embodiments, infrared (IR) imaging can be used to detect specific features of the viewer, such as the viewer's eyes, for use in determining and/or tracking the location of the viewer. Changes in the orientation and/or position of the device can be determined using at least one motion sensor of the device, in order to provide for a higher sampling frequency than might otherwise be possible using the image information captured by the camera, or otherwise attempt to improve the relative position determinations.
In at least some embodiments, the electronic device can attempt to determine changes in the relative position, direction, and/or orientation between the viewer and device in order to update reflective characteristics of the specular reflective display. For example, the device can continue capturing and analyzing image information to attempt to determine changes in relative position of the viewer, such as may be based on movement of the viewer and/or the device. The device also can utilize information from at least one orientation or position determining element of the device, such as an accelerometer or inertial sensor, to assist in detecting motions of the device and updating the viewing angle accordingly. These elements also can detect changes in orientation of the device, such as through rotation of the device, even though the relative position between the viewer and the device might not have substantially changed.
The techniques and systems described herein may be implemented in a number of ways. Example implementations are provided below with reference to the following figures.
At 112, the electronic device 110 and/or the SR display 104 may detect a user position and/or viewing angle of the user from imagery captured by the image sensor(s) 110. Thus, the user's gaze 106 may be detected and/or approximated by detecting a position of the user relative to the electronic device 110 and/or relative to the SR display 104. In some embodiments, this process may use head detection algorithms, eye detection algorithms, and/or other image analysis to determine a position of the user and/or an approximate line of sight (the gaze 106) of the user toward the SR display 104. Additional details on these techniques are discussed below with reference to
At 114, the electronic device 110 and/or the SR display 104 may determine a reflection angle correction, θ, 116(1), that may improve or optimize the user's experience when viewing the SR display 104. The user's experience may be improved by enhancing an overall perceived brightness, contrast ratio, and viewing angle performance of the specular reflective display irrespective of illumination conditions (e.g., sunlight, airplane lighting, and overhead lamps). The reflection angle correction, θ, 116(1) may be an angle between a pre-adjustment direction of light reflection 118(1) from the SR display 104 and the direction of the gaze 106.
At 120, the electronic device 110 and/or the SR display 104 may adjust the SR display (including components of the display such as lights, reflective panels, films, electrical/mechanical devices, and/or other components) to create a resultant direction of light reflection 122 from the SR display 104. The resultant direction of light reflection 118(2) may reduce a resultant reflection angle correction, θ, 116(2), or possibly eliminate any correction angle (i.e., when the resultant direction of light reflection 118(2) is parallel to the gaze 106). As discussed above, a reduction in the resultant reflection angle correction, θ, 116(2) may result in an enhancement in overall perceived brightness, contrast ratio, and viewing angle performance of the specular reflective display irrespective of illumination conditions.
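A minimal numerical sketch of the correction-angle determination at 114 is shown below. It represents the pre-adjustment reflection direction 118(1) and the gaze 106 as vectors pointing away from the display and reports the angle between them; the example vectors are illustrative, not values taken from any figure.

```python
import math


def reflection_angle_correction(reflection_dir, gaze_dir):
    """Angle, in degrees, between the pre-adjustment direction of light
    reflection and the viewer's line of sight, with both expressed as vectors
    pointing away from the display surface.  Zero means the reflected light
    already travels along the gaze."""
    def unit(v):
        mag = math.sqrt(sum(c * c for c in v))
        return tuple(c / mag for c in v)

    r, g = unit(reflection_dir), unit(gaze_dir)
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(r, g))))
    return math.degrees(math.acos(dot))


# Example: reflected light currently leaves straight out of the display while
# the viewer's line of sight is about 20 degrees off the display normal.
theta = reflection_angle_correction((0.0, 0.0, 1.0),
                                    (0.0, math.sin(math.radians(20.0)),
                                     math.cos(math.radians(20.0))))
print(round(theta, 1))  # -> 20.0
```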
As illustrated, the devices 202 include various components 204. In some embodiments, the components 204 include computer-readable media 206 and one or more processors 208. The processors 208 interact with the computer-readable media 206 to execute instructions and facilitate operation of the device 202. The computer-readable media 206, meanwhile, may be used to store data 210, such as data files, audio and/or video media, electronic books (eBooks), or the like. In some embodiments, the data 210 may include information to cause adjustment of the SR display, as discussed below in greater detail. The computer-readable media 206 may also include software programs or other executable modules 212 that may be executed by the processors 208. Examples of such programs or modules include indexing modules for indexing data, reader programs, control modules (e.g., power management), network connection software, an operating system, sensor algorithms, and so forth.
The computer-readable media 206 may include volatile memory (such as RAM), nonvolatile memory, removable memory, and/or non-removable memory, implemented in any method or technology for storage of information, such as computer-readable instructions, data structures, program modules, or other data. Also, the processors 208 may include onboard memory in addition to or instead of the computer-readable media 206. Some examples of storage media that may be included in the computer-readable media 206 and/or processors 208 include, but are not limited to, random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disk (CD-ROM), digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by the devices 202. Any such computer-readable media may be part of the devices 202. In some embodiments, the computer-readable media 206 may be non-transitory computer readable media.
In accordance with one or more embodiments, the computer-readable media 206 may include a display optimizer 214 that may process inputs from at least some of the components 204, such as an image sensor 216, to determine adjustments to a specular reflective display 218, as discussed herein. For example, the display optimizer 214 may detect a position of a user of the electronic device 202, determine a reflection angle correction, and cause adjustment to the SR display 218 to reduce or eliminate the reflection angle correction (as described above in the process 100 shown in
The computer-readable media 206 may also store component drivers, such as a display driver 220, that include instructions that, when executed by the processors 208, are used to control the various components 204, such as the SR display 218. For example, the component drivers may be programs that can be used to control the operation, power consumption, and various operational states of each of the components 204. Typically, each component has its own corresponding component driver. Thus, in some embodiments, the display optimizer 214 may adjust the SR display 218 via the display driver 220.
The SR display 218 may include at least one of a light emitting controller 222, a light reflection controller 224, and/or an electrical/mechanical (E/M) controller 226, which may process signals from the display optimizer to cause adjustment of the SR display 218. The light emitting controller 222 may process the signals to determine an adjustment to an internal light source, such as a light emitting diode (LED) to change an angle or direction of light reflected into or from the specular reflective display. For example, the specular reflective display may include an array of lights which may be selectively turned on/off depending on a detected position of the user. The selected light(s) may cause emission of light in alignment or near alignment with the gaze of the user (i.e., line of sight of the user while viewing the display). Stated another way, the selection of the light(s) may cause emitted light to exit the specular reflective display in a direction to intersect the position where a face of the user is relative to the electronic device.
The light reflection controller 224 may process the signals to determine an adjustment to redirect a direction of light from external sources (e.g., ambient light) and/or internal sources (e.g., LEDs) towards the specular reflective display by use of films, lenses, light guides, light piping, and/or other static or movable structures. The redirected light may then be in alignment or near alignment with the gaze of the user.
The E/M controller 226 may process the signals to determine an adjustment to rotate or otherwise move at least a portion of the specular reflective display (e.g., the entire display, sections of the display, pixels of the display) using electrical and/or mechanical devices (e.g., micro-electromechanical systems (MEMS)). These techniques may also be used to adjust a position of a back reflector. In some embodiments, the electronic device and/or the specular reflective display may process the signals to determine an adjustment to manipulate refractive index changing materials at a bottom (i.e., back layer) of a display stack for transflective displays. The redirected light may then be in alignment or near alignment with the gaze of the user.
Various processes, instructions, methods and techniques described herein may be considered in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. for performing particular tasks or implementing particular abstract data types. These program modules can be implemented as software modules that execute on the processors 208, as hardware, and/or as firmware. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments. An implementation of these modules and techniques may be stored on or transmitted across some form of computer-readable media.
At 302, the display optimizer 214 may detect a user position and/or viewing angle. In some embodiments, the display optimizer 214 may determine, based on output data from the at least one image sensor, position data representing a position where at least a portion (e.g., face, head, body, eyes) of a user is relative to the electronic device.
At 304, the display optimizer 214 may determine an angle of a light source. For example, the display optimizer 214 may determine a known or controlled direction of light emitted from the display (e.g., via backlight, a front light, etc.). The display optimizer 214 may detect a direction of light using the image sensor 216. The angle of the light correlates to the pre-adjustment direction of light reflection (labeled 118(1) in
At 306, the display optimizer 214 may determine a correction angle of light reflection. The correction angle, when compensated for as discussed below, may cause light output from the display to align or nearly align with a gaze of the user, thereby enhancing an overall perceived brightness, contrast ratio, and viewing angle performance of the specular reflective display irrespective of illumination conditions. The display optimizer 214 may determine a correction angle of light reflection based at least partly on the outputs of the operations 302 and 304. In some embodiments, the display optimizer 214 may determine a change in a direction of light traveling through a portion of the electronic display based on the position data, wherein the change in direction causes the light to exit the electronic display in a direction to intersect the position where the at least the portion of the user is relative to the electronic display.
At 308, the display optimizer 214 may utilize one or more of the light emitting controller 222, the light reflection controller 224, and/or the E/M controller 226 to cause a reduction of the correction angle. Thus, the display optimizer may utilize a single one of the controllers, multiple controllers (if present), and/or all of the controllers depending on the configuration of the SR display 218 and/or type of adjustment desired.
At 310, following route “A”, the light emitting controller 222 may modify an internal light source angle by determining an adjustment to an internal light source, such as a light emitting diode (LED) to change an angle or direction of light reflected into or from the specular reflective display. For example, the specular reflective display may include an array of lights which may be selectively turned on/off depending on a detected position of the user. The selected light(s) may cause emission of light in alignment or near alignment with the gaze of the user (i.e., line of sight of the user while viewing the display).
At 312, following route “B”, the light reflection controller 224 may modify an ambient light angle and/or reflected light angle, such as by determining an adjustment to redirect a direction of light from external sources (e.g., ambient light) and/or internal sources (e.g., LEDs) towards the specular reflective display by use of films, lenses, light guides, light piping, and/or other static or movable structures.
At 314, following route “C”, the E/M controller 226 may modify a display element angle, such as by determining an adjustment to rotate or otherwise move at least a portion of the specular reflective display (e.g., the entire display, sections of the display, pixels of the display) using electrical/mechanical devices (e.g., micro-electromechanical systems (MEMS)). These techniques may also be used to adjust a position of a back reflector.
Following implementation of the operations 310, 312, and/or 314, the process 300 may advance to an operation 316. At 316, the display optimizer 214 may apply the correction type(s) from the operations 310, 312, and/or 314 to reduce (or possibly eliminate) the correction angle determined at the operation 306. For example, multiple processes may occur in parallel to change a direction of light (e.g., multiple processes from a single operation, a process from each of two different operations, and so forth). The process 300 may then continue via a loop to the operation 302 to continually adjust the display to improve and/or optimize a user's viewing experience as discussed herein, such as to make further adjustments when the user moves relative to the display.
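The loop below sketches how the operations 302 through 316 might be strung together in software. The sensor methods, controller objects, and route weighting are hypothetical stand-ins for whatever hardware interfaces an actual implementation exposes, not part of the disclosure.

```python
import time


def run_display_optimizer(image_sensor, controllers, frame_period_s=0.1,
                          deadband_deg=1.0):
    """Continuously reduce the reflection correction angle, loosely following
    operations 302-316 of process 300.

    `image_sensor.detect_viewer_angle()` and `image_sensor.light_source_angle()`
    are assumed to return angles in degrees (or None when no viewer is found),
    and `controllers` maps route labels ("A" light emitting, "B" light
    reflection, "C" electrical/mechanical) to objects exposing an
    `apply(correction_deg)` method.  All of these interfaces are hypothetical.
    """
    while True:
        viewer_deg = image_sensor.detect_viewer_angle()      # operation 302
        light_deg = image_sensor.light_source_angle()        # operation 304
        if viewer_deg is not None:
            correction_deg = viewer_deg - light_deg          # operation 306
            if controllers and abs(correction_deg) > deadband_deg:  # op. 308
                # Split the correction across whichever controllers exist; a
                # real device might use only one route, or weight the routes
                # by their range and responsiveness.
                share = correction_deg / len(controllers)
                for route in ("A", "B", "C"):                # 310 / 312 / 314
                    if route in controllers:
                        controllers[route].apply(share)      # operation 316
        time.sleep(frame_period_s)                           # loop back to 302
```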
The angle/orientation of the moveable reflective panels 404 may be controlled by the E/M controller 226 and/or the light reflection controller 224, which may cause movement of the panels 404 via moveable arms 410 which may correspond to each panel (e.g., 410(1)-(N)). The moveable arms 410 may move, pivot, change angle, and/or translate with respect to a structure 412. In some embodiments, the moveable arms 410 may include biasing devices that cause movement of the panels 404. The moveable arms 410 may enable rotation, pivoting, lateral movement, extension, and/or other movements of respective panels 404, which in turn may cause the light (e.g., the initial light 401) to reflect toward the SR display 218 at different determined angles. By changing the angle/orientation of individual panels 404, the resulting angle of the second reflected light 406 may be changed, and thus the light perceived by the user may be redirected. The orientation/angle of the panels 404 may be adjusted based on the determined user position 408, as discussed throughout this disclosure. Although
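The tilt required of any one moveable panel can be reasoned about with the mirror law: for specular reflection, the panel's normal must bisect the direction toward the light source and the desired outgoing direction. The sketch below works in a single 2-D cross-section with hypothetical angle values.

```python
def panel_tilt(source_deg, viewer_deg):
    """Tilt, in degrees, to apply to a reflective panel so that light arriving
    from `source_deg` is reflected toward `viewer_deg`.

    Both angles are measured from the panel's untilted normal in the same 2-D
    cross-section: `source_deg` is the direction toward the light source and
    `viewer_deg` the desired outgoing direction.  Because the mirror normal
    must bisect the two directions, the tilt is simply their average.
    """
    return (source_deg + viewer_deg) / 2.0


# Example: the initial light arrives 30 degrees to one side of the panel
# normal and the determined user position lies 10 degrees to the other side;
# a 10 degree tilt places the panel normal on the bisector.
print(panel_tilt(30.0, -10.0))  # -> 10.0
```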
In various embodiments, the array of lights 502 may include at least a first plurality of lights and a second plurality of lights, which may be arranged in an alternating layout. For example, the first plurality of lights may include the lights 502(1) and 502(3) while the second plurality of lights may include the lights 502(2) and 502(M). The first plurality of lights may be controlled as a group, and thus turned on, dimmed, intensified, and/or turned off together. Similarly, the second plurality of lights may be controlled as a group, and thus turned on, dimmed, intensified, and/or turned off together.
In some embodiments, light piping may be used to change the direction of the light directed toward the fixed reflective structure 500 or may be used as the fixed reflective structure 500 to direct light from activated lights (of the light array 502) toward the SR display 218 at a determined angle. For example, different light piping may be associated with different lights or groups of lights. When a first set of lights is activated, a first associated set of light pipes may direct the light toward the SR display at first angles. When a second different set of lights is activated, a second different set of light pipes may direct the light toward the SR display at second different angles.
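Group-level control might then reduce to picking the plurality whose associated light-pipe exit angle most nearly produces the reflection needed for the detected user position, as in this illustrative sketch; the two fixed pipe angles are made-up values.

```python
def choose_light_group(desired_exit_deg, group_exit_angles_deg):
    """Return the index of the light group whose fixed light-pipe exit angle
    is closest to the exit angle needed to reach the viewer."""
    return min(range(len(group_exit_angles_deg)),
               key=lambda i: abs(group_exit_angles_deg[i] - desired_exit_deg))


# Example: the first plurality pipes light out at -15 degrees and the second
# plurality at +15 degrees; a viewer detected at +9 degrees off-axis is best
# served by activating the second group.
print(choose_light_group(9.0, [-15.0, 15.0]))  # -> 1
```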
The E/M controller 226 may control the movement of the SR display or a portion of the SR display, such as by applying a current which attracts or repels the SR display 218 toward or away from one of the surfaces 1212. Other mechanisms, such as mechanical devices including micro-actuators, may cause the display to bend or deform. The movement of the SR display 218 using this approach may enable only small changes in angle. Thus, this approach may be used with other approaches discussed herein to “fine tune” an angle of the redirected light. This approach may also be effective when ambient light is a primary source of light.
Various other algorithms can be used to determine the location of features on a user's face. For example,
In a basic configuration, a single camera may be used to capture imagery of a user and locate an area of a portion of the user's face, via X and Y axes (i.e., in two-dimensional space), in the captured imagery. This two-dimensional location of the portion of the user's face may suffice as input data for adjusting display characteristics as discussed above. As an example, the imagery captured by the single camera may be analyzed to distinguish moving portions of the imagery across multiple images (which may be inferred to be background) when a face remains generally in a same position (e.g., when a user is walking with a handheld device held in a relatively consistent location relative to the user's head). The opposite situation may also be used, where the background does not move and the face does move across multiple images. The face may then be identified as being different than the background.
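As a loose illustration of that motion cue, the sketch below separates pixels that change little across consecutive frames (a candidate face region held steady relative to the device) from pixels that change a lot (a moving background). It is a simplification that assumes grayscale frames as NumPy arrays and ignores noise handling and the reverse case of a static background.

```python
import numpy as np


def static_region_centroid(frames, motion_threshold=15.0):
    """Given a sequence of grayscale frames (2-D uint8 arrays) captured while
    the background moves but the face stays roughly fixed in the image, return
    the (x, y) centroid of the low-motion region as a crude face location, or
    None if no sufficiently static region is found."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    # Per-pixel mean absolute difference between consecutive frames: small
    # values indicate content that stays put (candidate face), large values
    # indicate the moving background.
    motion = np.mean(np.abs(np.diff(stack, axis=0)), axis=0)
    static_mask = motion < motion_threshold
    if not static_mask.any():
        return None
    ys, xs = np.nonzero(static_mask)
    return float(xs.mean()), float(ys.mean())
```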
Once the positions of facial features of a user are identified, relative motion between the user and the device can be detected and utilized as input. For example,
In at least some embodiments, a computing device can utilize one or more cameras or other such sensors to determine the relative direction of the user. For example,
Software executing on the computing device (or otherwise in communication with the computing device) can obtain information such as the angular field of view of the camera, the zoom level at which the information is currently being captured, and any other such relevant information, which can enable the software to determine an approximate direction 1510 of at least one of the user's eyes with respect to the camera. In some embodiments, methods such as ultrasonic detection, feature size analysis, luminance analysis through active illumination, or other such distance measurement approaches can be used to assist with position determination. In other embodiments, a second camera can be used to enable distance determinations through stereoscopic imaging. Once the direction vectors from at least two image capture elements are determined for a given feature, the intersection point of those vectors can be determined, which corresponds to the approximate relative position in three dimensions of the respective feature as known for disparity mapping and other such processes.
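The direction-vector intersection mentioned above can be approximated numerically as the point of closest approach between two rays, one per camera. The sketch below assumes each ray is given by a camera position and a unit direction vector toward the feature in a shared coordinate frame; the camera spacing and feature location in the example are hypothetical.

```python
import numpy as np


def triangulate_feature(p1, d1, p2, d2):
    """Approximate 3-D position of a feature seen by two cameras.

    `p1`, `p2` are camera positions and `d1`, `d2` direction vectors from each
    camera toward the feature, all in one shared coordinate frame.  The two
    rays rarely intersect exactly, so the midpoint of their closest approach
    is returned (or None if the rays are near-parallel and give no depth cue).
    """
    p1, d1, p2, d2 = (np.asarray(v, dtype=float) for v in (p1, d1, p2, d2))
    r = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    denom = a * c - b * b
    if abs(denom) < 1e-9:
        return None
    s = (b * (d2 @ r) - c * (d1 @ r)) / denom
    t = (a * (d2 @ r) - b * (d1 @ r)) / denom
    return (p1 + s * d1 + p2 + t * d2) / 2.0


# Hypothetical stereo pair 20 cm apart, viewing an eye at roughly (0, 0, 2) m.
p1, p2 = np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])
target = np.array([0.0, 0.0, 2.0])
d1 = (target - p1) / np.linalg.norm(target - p1)
d2 = (target - p2) / np.linalg.norm(target - p2)
print(triangulate_feature(p1, d1, p2, d2))  # -> approximately [0. 0. 2.]
```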
Further illustrating such an example approach,
When using a camera to track location, however, the accuracy is limited at least in part by the frame rate of the camera. Further, images take some time to process such that there can be some lag in the determinations. As changes in orientation of the device can occur relatively quickly, it can be desirable in at least some embodiments to enhance the accuracy of the point of view determinations. In some embodiments, a sensor or other such element of a computing device can be used to determine motions of the computing device, which can help adjust point of view determinations. The sensors can be any appropriate sensors capable of providing information about rotations and/or translations of the device, as may include accelerometers, inertial sensors, electronic gyroscopes, electronic compasses, and the like.
For example,
A first frame of reference 1606 or orientation can be determined at or near the time of capture of a first image by a camera 1610 of the computing device 1602. In some embodiments, the determination can be triggered by receiving input to capture an image or another such action, but in other embodiments the frame of reference and/or orientation information can be updated periodically, such as several times a second based upon the type and/or configuration of the electronic gyroscope. The gyroscope can also be any appropriate electronic gyroscope component, such as a conventional MEMS gyroscope used in various consumer devices. Approaches for implementing and obtaining orientation changes from such a gyroscope are well known in the art and, as such, will not be discussed in detail herein.
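Between camera frames, angular-rate samples from such a gyroscope could be integrated to keep the point-of-view estimate fresh, then pulled back toward the camera's measurement whenever a new frame is processed. The sketch below is a deliberately simple blending scheme with made-up method names for the sensor interfaces.

```python
class PointOfViewTracker:
    """Blend slow camera-based viewer-angle measurements with fast gyroscope
    rate samples (a very simple complementary scheme).

    Angles are in degrees about a single axis for clarity; the attribute and
    method names are illustrative only.
    """

    def __init__(self, initial_angle_deg=0.0, camera_weight=0.98):
        self.angle_deg = initial_angle_deg
        self.camera_weight = camera_weight

    def on_gyro_sample(self, rate_deg_per_s, dt_s):
        # Integrate the device's rotation rate; this runs at the gyroscope's
        # high sampling frequency, between camera frames.
        self.angle_deg += rate_deg_per_s * dt_s

    def on_camera_measurement(self, measured_angle_deg):
        # When a slower, laggier camera-based measurement arrives, pull the
        # integrated estimate toward it to cancel accumulated gyro drift.
        w = self.camera_weight
        self.angle_deg = w * measured_angle_deg + (1.0 - w) * self.angle_deg


# Example: the device rotates at 30 deg/s for 100 ms between camera frames.
tracker = PointOfViewTracker(initial_angle_deg=5.0)
for _ in range(10):
    tracker.on_gyro_sample(30.0, 0.01)   # ten 10 ms gyroscope samples
tracker.on_camera_measurement(8.2)       # the next camera frame corrects drift
print(round(tracker.angle_deg, 2))       # -> approximately 8.2
```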
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as illustrative forms of implementing the claims.