Various embodiments of TOF depth cameras and methods for illuminating image environments with illumination light are provided herein. In one example, a TOF depth camera configured to collect image data from an image environment illuminated by illumination light includes a light source including a plurality of surface-emitting lasers configured to generate coherent light. The example TOF camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment.
1. A time-of-flight depth camera configured to collect image data from an image environment illuminated by illumination light, the time-of-flight depth camera comprising:
a light source including an array of surface-emitting lasers configured to generate coherent light;
a homogenizing light guide positioned to receive at least a portion of coherent light from the light source, the homogenizing light guide being configured to increase an apparent size of the light source;
a microlens array positioned to receive at least a portion of the light emitted from the homogenizing light guide, the microlens array adapted to diverge the light received from the homogenizing light guide for projection into the image environment as illumination light; and
an image sensor configured to detect at least a portion of return light reflected from the image environment.
2. The time-of-flight depth camera of
3. The time-of-flight depth camera of
4. The time-of-flight depth camera of
5. The time-of-flight depth camera of
6. The time-of-flight depth camera of
7. The time-of-flight depth camera of
8. The time-of-flight depth camera of
9. A peripheral time-of-flight depth camera system configured to collect image data from an image environment illuminated by illumination light, the peripheral time-of-flight depth camera comprising:
a light source including a plurality of surface-emitting lasers configured to generate coherent light;
a reflective light guide including a folded light path that receives at least a portion of the coherent light from the light source and emits all of the portion of the coherent light received, the reflective light guide configured to self-correct one or more reflection errors via total internal reflection;
a microlens array positioned to receive at least a portion of the light emitted from the reflective light guide, the microlens array adapted to diverge the light received from the reflective light guide for projection into the image environment as illumination light;
an image sensor configured to detect at least a portion of return light reflected from the image environment;
a logic subsystem; and
a storage subsystem holding instructions executable by the logic subsystem to generate depth information about the object based upon image information generated by the image sensor from detected return light and to output the depth information to a computing device.
10. The peripheral time-of-flight depth camera system of
11. The peripheral time-of-flight depth camera system of
a light entrance configured to receive at least a portion of coherent light from the light source;
a first total internal reflection surface configured to receive all of the coherent light from the light entrance;
a second total internal reflection surface configured to receive all of the coherent light reflected from the first total internal reflection surface; and
a light exit configured to emit all of the coherent light reflected from the second total internal reflection surface.
12. The peripheral time-of-flight depth camera system of
13. The peripheral time-of-flight depth camera system of
14. A time-of-flight depth camera configured to collect image data from an image environment illuminated by illumination light, the time-of-flight depth camera comprising:
a light source comprising a plurality of surface-emitting lasers configured to generate coherent light;
a light guide positioned to receive at least a portion of the coherent light from the light source, the light guide being configured to spread the coherent light within the light guide and to emit the portion of the coherent light;
a microlens array positioned to receive at least a portion of the light emitted from the light guide, the microlens array adapted to diverge the light received from the light guide for projection into the image environment as illumination light; and
an image sensor configured to detect at least a portion of return light reflected from the environment.
15. The time-of-flight depth camera of
16. The time-of-flight depth camera of
17. The time-of-flight depth camera of
18. The time-of-flight depth camera of
19. The time-of-flight depth camera of
20. The time-of-flight depth camera of
This application is a divisional of U.S. application Ser. No. 13/585,638, filed Aug. 14, 2012 and entitled “ILLUMINATION LIGHT PROJECTION FOR A DEPTH CAMERA”, the entirety of which is hereby incorporated by reference for all purposes.
In a time-of-flight (TOF) depth camera, light pulses are projected from a light source onto objects in an image environment, and the returning light is focused onto an image sensor. It can be difficult to fill the image environment with illumination light, as the image environment may have a sizeable volume and may have a cross-sectional shape (e.g. rectangular) that can be difficult to fill with a desired intensity profile. Further, the imaging optics may have a large depth of field over which a consistent projected light intensity is desired.
Some previous approaches to filling image environments with light use high-order optics to shape diverging light emitted from side-emitting light sources. However, such approaches typically require precise design and manufacturing control of the angular distribution of the light in order to fill the image environment.
Various embodiments related to illuminating image environments with illumination light for a TOF depth camera are provided herein. For example, one embodiment provides a TOF depth camera configured to collect image data from an image environment illuminated by illumination light. The TOF depth camera includes a light source including a plurality of surface-emitting lasers configured to generate coherent light. The TOF depth camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
As mentioned above, a TOF depth camera utilizes light pulses (e.g. infrared and/or visible light) projected from the TOF depth camera into an image environment. The illumination light pulses reflect from the various surfaces of objects in the image environment and are returned to an image sensor. The TOF depth camera generates distance data by quantifying time-dependent return light information. In other words, because light is detected sooner when reflected from a feature nearer to the photosensitive surface than from an object feature farther away, the TOF depth camera can determine distance information about the object's features.
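As a minimal, illustrative sketch of that relationship (not a description of the camera's actual pulse timing or correlation scheme), the distance to a surface follows from half the round-trip travel time of the light:

```python
# Illustrative sketch only: converting a measured round-trip delay to depth.
# The specific timing/correlation method used by the depth camera is not specified here.
C_MM_PER_NS = 299.792458  # speed of light, in millimeters per nanosecond

def depth_from_round_trip(delay_ns: float) -> float:
    """Return object distance in millimeters for a measured round-trip delay in nanoseconds."""
    return 0.5 * C_MM_PER_NS * delay_ns  # light travels out and back, so halve the path

# Example: a 20 ns round-trip delay corresponds to roughly 3 meters.
print(depth_from_round_trip(20.0))  # ~2998 mm
```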
It may be difficult to fill the image environment with illumination light of a desired intensity profile. For example, it may be desirable for the intensity of the projected light to be somewhat greater in a region near a periphery of the image environment than at its center, as light reflected from peripheral regions may have a lower intensity at the image sensor due to the angle of incidence on the imaging optics.
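One commonly cited contributor to this falloff (offered here as general optics background rather than as a characterization of image sensor 110) is the approximate relative-illumination law for a simple imaging lens, $E(\theta) \approx E_0 \cos^4\theta$. Under that assumption, a feature at a 30 degree field angle returns only about $\cos^4(30^\circ) \approx 0.56$ of the on-axis irradiance, a roll-off that a projected profile made brighter toward the periphery can partially pre-compensate.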
Further, as mentioned above, the image environment may have a different cross-sectional shape than the light emitted by the light source. The image environment also may be relatively large, in order to capture potentially large ranges of movement by potentially multiple users.
Illumination sources used with TOF depth cameras may emit light in circular patterns or circularly-shaped emission envelopes. Therefore, overlaying a circularly-shaped emission pattern onto a non-circular image environment in a manner that achieves a relatively uniform illumination intensity across the entire non-circular image environment may result in the illumination of portions of the environment that are not used for depth analysis. This may waste light source power, and also may involve the use of a more powerful and expensive light source.
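As a rough, hypothetical illustration of that waste, assume a 4:3 rectangular field of view inscribed in a circular emission envelope: the rectangle covers only $(4 \times 3)/(\pi \times 2.5^2) \approx 0.61$ of the circle's area, so roughly 39% of the projected optical power would land outside the region actually used for depth analysis.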
Some previous approaches to reshaping illumination light employ random distributions of spherical microlenses. By randomly distributing the microlenses, the shape of the emitted light may be adjusted while avoiding the introduction of diffractive interference that may result from a periodic arrangement of microlenses. However, because the microlenses are randomly sized, the ability to control the distribution of light within the image environment, including the light's cross-sectional profile and the dimensions of the envelope that it illuminates within the room, may be compromised.
Accordingly, various embodiments of TOF depth cameras and methods for illuminating image environments with illumination light are provided herein. For example, in some embodiments, a TOF depth camera includes a light source including a plurality of surface-emitting lasers configured to generate coherent light. The example TOF camera also includes an optical assembly configured to transmit light from the plurality of surface-emitting lasers to the image environment and an image sensor configured to detect at least a portion of return light reflected from the image environment. The plurality of surface-emitting lasers may be arranged in a desired illumination light shape, thereby allowing an image of the shape of the light source to be relayed into the image environment. In other embodiments, a homogenizing light guide may be configured to provide a shaped light source for such use.
While the example shown in
TOF depth camera 100 also includes an image sensor 110 configured to detect at least a portion of return light 112 reflected from image environment 106. Image sensor 110 includes a detector 114 for collecting return light 112 for use in generating depth information (such as a depth map) for the scene.
In the embodiment shown in
Surface-emitting lasers 202 may be fabricated on a suitable substrate (e.g., GaAs) using large-scale integration techniques (e.g., film deposition and film patterning techniques). In some examples, a die comprising a laser array 200 may include hundreds or more of surface-emitting lasers 202. For example, a 1.5 mm square die including surface-emitting lasers 202 that have a center-to-center pitch of approximately 44 μm may include up to 1156 surface-emitting lasers 202.
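A minimal sketch of the arithmetic behind that example count, assuming a simple square grid of emitters filling the die:

```python
# Illustrative only: emitter count for a square die filled with a square grid of surface-emitting lasers.
die_size_um = 1500.0   # 1.5 mm square die
pitch_um = 44.0        # approximate center-to-center emitter pitch

emitters_per_side = int(die_size_um // pitch_um)   # 34 emitters fit along each edge
total_emitters = emitters_per_side ** 2            # 34 * 34 = 1156
print(emitters_per_side, total_emitters)           # 34 1156
```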
Another embodiment of a surface-emitting laser 202 is shown in
Turning back to
The illumination envelope region refers to a cross-sectional area that is lit with illumination light 108. In the embodiment shown in
As mentioned above, in some embodiments, the lasers included in light source 118 may be arranged in a shape that matches that of a desired emission envelope (e.g., a shape or pattern of light projected by the lasers), and optical assembly 120 may be configured to transmit or relay that shape to the far field. In such embodiments, the emission envelope and illumination envelope region 128 may take the shape of the arrangement of the lasers. Thus, as one specific example, a rectangularly-shaped array of surface-emitting lasers may be used to generate a rectangularly-shaped light envelope in the far field. In other embodiments, optical assembly 120 may be configured to re-shape the emission envelope. For example, light emitted from a square arrangement of surface-emitting lasers may be reshaped into a rectangularly-shaped light envelope in the far field.
Further, in some embodiments, optical assembly 120 may shape the cross-sectional light intensity/irradiance profile of illumination light 108 from a Gaussian profile into a differently-shaped illumination profile. For example, in some embodiments, illumination light 108 may be shaped into an illumination profile exhibiting a flat-topped, mesa-like shape that is symmetrically oriented around an optical axis of illumination light 108. In such embodiments, the irradiance of illumination light 108 may have a constant intensity, within an acceptable tolerance, in a region near the optical axis (e.g., a region corresponding to a top of the mesa). The irradiance may then decrease in intensity in a region farther from the optical axis (e.g., a region corresponding to sidewalls of the mesa).
In some other embodiments, illumination light 108 may be characterized by a cross-sectional light profile that is more intense farther from an optical axis of illumination light 108 than closer to an optical axis of the illumination light.
Without wishing to be bound by theory, generating an “M”-shaped profile for the illumination light may offset a “W”-shaped intensity profile received at image sensor 110 due to reflection effects caused by objects in the image environment. In other words, the net effect of supplying light with an “M”-shaped profile to image environment 106 may be that image sensor 110 detects return light having a mesa-shaped profile.
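A minimal numerical sketch of this compensation idea, assuming a purely hypothetical cos⁴-style collection roll-off (the actual received profile depends on the scene and the imaging optics):

```python
# Illustrative sketch only: a projected profile chosen to pre-compensate an assumed
# roll-off so the detected return profile comes out roughly flat ("mesa"-like).
import numpy as np

theta = np.linspace(-0.5, 0.5, 101)       # hypothetical field angle, radians
rolloff = np.cos(theta) ** 4              # assumed collection roll-off, strongest on axis

illumination = 1.0 / rolloff              # "M"-like: brighter toward the periphery
illumination /= illumination.max()        # normalize the peak to 1

detected = illumination * rolloff         # what the sensor would see
print(round(detected.min(), 3), round(detected.max(), 3))  # 0.593 0.593 -> flat profile
```

Here the projected profile is simply the normalized reciprocal of the assumed roll-off, which is one way to obtain an off-axis-heavy, "M"-like shape of the kind described above.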
Lens system 600 may utilize a high f-number aperture stop 610 to achieve a desired depth of field for the relayed image source light in the illumination depth region 122. In some non-limiting embodiments, f-numbers in a range of f/250 to f/1000 may be used to provide an illumination depth region having a depth of field in a corresponding range of 500 to 3500 mm.
Condenser lens stage 602 is positioned within lens system 600 to receive light from light source 118, condensing divergent rays of the emitted light and forming aperture stop 610. In some embodiments, condenser lens stage 602 may be configured to condense the light received without magnifying or demagnifying the light beyond an acceptable tolerance. Additionally or alternatively, in some embodiments, condenser lens stage 602 may be configured to impart or shape the light received into a selected light illumination profile. For example, condenser lens stage 602 may distort light received from light source 118 to generate the “M”-shaped profile described above, or any other suitable cross-sectional illumination profile.
Relay lens stage 604 is positioned to receive light from condenser lens stage 602 and relay an image of light source 118 into illumination depth region 122. Stated differently, relay lens stage 604 provides the power within lens system 600 to transmit the image of light source 118 into image environment 106, forming and lighting illumination envelope region 128.
In some embodiments, an optional Schmidt plate 606 may be included within lens system 600, positioned at an entrance pupil 612 of lens system 600. Schmidt plate 606 may be used to introduce aberrations to illumination light to reduce the intensity of diffraction artifacts that may be introduced by surface-emitting lasers 202. Further, Schmidt plate 606 may help to achieve a desired light illumination profile. For example, including Schmidt plate 606 may emphasize peaks and valleys within an “M”-shaped illumination profile imparted by condenser lens stage 602. As the defocusing effect of Schmidt plate 606 may impact the collimating effect of condenser lens stage 602, potentially reducing depth of illumination depth region 122, inclusion of Schmidt plate 606 may be accompanied by a compensatory adjustment to the f-number of lens system 600.
While lens system 600 depicts classical lenses for clarity, it will be appreciated that any suitable embodiment of the lens stages described above may be included within lens system 600 without departing from the scope of the present disclosure. For example, in some embodiments, wafer-level optics may be employed for one or more of the lens stages. As used herein, a wafer optic structure refers to an optical structure formed using suitable formation and/or patterning processes like those used in semiconductor patterning. Wafer-level optics may offer the potential advantages of cost-effective miniaturization of one or more of the lens stages and/or enhanced manufacturing tolerances for such stages.
While lower levels of collimation may spread illumination light 108 over a greater area, that spreading may be accompanied by a reduction in illumination depth region 122. Accordingly, in some embodiments, a lens system may be formed using diffractive optics. If diffractive optical elements are employed for one or more of the lens elements/stages included in the lens system, a diffractive optic substrate will have a prescription for those stages encoded on a respective surface of the substrate. In some embodiments, for example, a single substrate may have a light-receiving surface that encodes a prescription for one lens stage and a light-emitting surface that encodes a prescription for another lens stage. Because the working surface of a diffractive optic is comparatively thinner than a classical lens analog, which may have a thickness set by a radius of curvature for the classical lens, the diffractive optic may offer miniaturization enhancements similar to wafer optics while also preserving collimation and depth of field. Moreover, in some embodiments, diffractive optics may permit one or more optical elements to be removed.
It will be appreciated that the relative positions of the optical stages described above may be varied in any suitable manner without departing from the scope of the present disclosure. For example, in some embodiments, one or more of the optical stages may be varied to increase the apparent size of light source 118. Increasing the size of light source 118 may reduce a user's ability to focus on the light source (e.g., by making the light source appear more diffuse) and/or may avoid directly imaging light source 118 on a user's retina. As a non-limiting example, some systems may be configured so that an image of light source 118 may not be focused on a user's retina when the user's retina is positioned within 100 mm of light source 118.
In some embodiments, increasing the apparent source size may include positioning relay lens stage 604 closer to light source 118, which may cause illumination light 108 to diverge faster, depending upon the configuration of the relay lens stage 604 and light source 118. Because this adjustment may also lead to an increase in the field of view and a decrease in illumination depth region 122, a prescription and/or position for condenser lens stage 602 may also be adjusted to change the focal length of optical assembly 120, and the arrangement and pitch of surface-emitting lasers 202 included within light source 118 may be varied to adjust illumination envelope region 128. In some embodiments, optical assembly 120 may also be configured to transform the emission envelope into a different shape while relaying the light to image environment 106.
Homogenizing light guide 902 takes the form of an optical wedge, though it will be appreciated that any suitable light guide configured to spread and smooth light may be employed without departing from the present disclosure. In the embodiment shown in
Light passing along homogenizing light guide 902 may travel in a collimated or near-collimated path to light emission surface 910. In some non-limiting examples, light may fan out by 9 degrees or less while traveling between light receiving surface 904 and light emission surface 910. However, light from light source 118 may blend and mingle while traveling through homogenizing light guide 902, so that the light emitted at light emission surface 910 causes the plurality of lasers to appear as a single, larger source located at light emission surface 910.
After emission from light emission surface 910, the light is received by a microlens array 912 and spread to fill illumination envelope region 128. Microlens array 912 includes a plurality of small lens elements configured to diverge the light and project it into image environment 106. For example,
Each of the lens elements 1002 included in microlens array 912 is configured to create the desired angular field of illumination for optical assembly 120. Put another way, each lens element 1002 is configured to impart a selected angular divergence to incoming light. As used herein, divergent light refers to coherent light that is spread from a more collimated beam into a less collimated beam. Divergent light may have any suitable illumination intensity cross-section, as explained in more detail below, and may have any suitable divergence angle, as measured between an optical axis and an extreme ray of the divergent light. The divergence angle may be adjusted by adjusting the pitch of the lens elements 1002 within microlens array 912. By spreading the incoming light, microlens array 912 transmits light to all regions within illumination envelope region 128.
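As a minimal geometric sketch of that relationship (the pitch and focal length below are hypothetical values, not dimensions of microlens array 912), the fan angle of a lenslet can be approximated from its aperture and focal length:

```python
# Illustrative sketch: approximate full divergence angle from lenslet pitch and focal length.
import math

def full_divergence_deg(pitch_mm: float, focal_length_mm: float) -> float:
    """Approximate full fan angle (degrees) for a lenslet of the given pitch and focal length."""
    return 2.0 * math.degrees(math.atan(pitch_mm / (2.0 * focal_length_mm)))

# Hypothetical values: a 0.25 mm pitch lenslet with a 0.30 mm focal length.
print(round(full_divergence_deg(0.25, 0.30), 1))  # ~45.2 degrees
```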
In some embodiments, the degree of divergence that may be realized by lens elements 1002 may be affected by the refractive index of the material from which the lenses are formed. As the lens curvature increases, the light approaches a total internal reflection limit. However, by increasing the index of refraction, a selected divergence angle may be achieved with comparatively less light bending. For example, in some embodiments, lens elements 1002 may be made from optical grade poly(methyl methacrylate) (PMMA), which has a refractive index of approximately 1.49. In other embodiments, lens elements 1002 may be made from optical grade polycarbonate (PC), having a refractive index of approximately 1.6. Lens elements 1002 made from PC may have less curvature to obtain the same divergence angle compared to elements made from PMMA. It will be appreciated that any suitable optical grade material may be used to make lens elements 1002, including the polymers described above, optical grade glasses, etc.
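A back-of-the-envelope illustration of that trade-off, using the thin-lens lensmaker's relation $1/f = (n-1)/R$ for a plano-convex lenslet with a hypothetical focal length (not a prescription for lens elements 1002):

```python
# Illustrative sketch: for the same focal length, a higher-index material needs a
# larger (flatter) radius of curvature on a plano-convex lenslet, since R = (n - 1) * f.
target_focal_mm = 0.30  # hypothetical lenslet focal length

for name, n in (("PMMA", 1.49), ("polycarbonate", 1.60)):
    radius_mm = (n - 1.0) * target_focal_mm
    print(f"{name}: R = {radius_mm:.3f} mm")
# PMMA:          R = 0.147 mm  (more strongly curved surface)
# polycarbonate: R = 0.180 mm  (flatter surface for the same focal length)
```

The higher-index polycarbonate reaches the same focal length with a larger radius of curvature, i.e. a flatter surface, consistent with the reduced curvature noted above.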
While the embodiment of microlens array 912 shown in
The aggregate effect of spreading the coherent light at each lens element 1002 may be to shape the cross-sectional light intensity/irradiance profile from a Gaussian profile associated with incident coherent light into a differently-shaped illumination profile. For example, in some embodiments, as few as six lens elements 1002 may be sufficient to form a desired illumination profile such as the “M”-shaped illumination profile described above.
Yet another approach to reshaping the emission envelope and increasing the apparent source size includes the use of a folded optical path within optical assembly 120.
In the example shown in
At 1402, method 1400 includes generating coherent light using a plurality of surface-emitting lasers. For example, coherent visible, infrared, or near-infrared light may be generated using suitable surface-emitting lasers like the VCSELs and/or VECSELs described herein.
In some embodiments, method 1400 may include homogenizing the coherent light at 1404. Homogenizing the coherent light may increase the apparent size of the light source and/or may cause the plurality of surface-emitting lasers to appear as a single source. In some of such embodiments, homogenizing the coherent light at 1404 may include, at 1406, homogenizing the illumination light using a homogenizing light guide. Non-limiting examples of homogenizing light guides include homogenizing light wedges and homogenizing light slabs configured to emit light along one surface via partial reflection of the light while totally reflecting light from another surface within the light guide. In other embodiments, homogenizing the coherent light at 1404 may include, at 1408, homogenizing the illumination light using a reflective light guide. Non-limiting examples of reflective light guides include guides that define folded light paths. In yet other embodiments, such homogenization may be omitted.
At 1410, method 1400 includes relaying the illumination light to the image environment. In some embodiments, relaying the illumination light to the image environment may include, at 1412, relaying an image of the light source to the image environment via a lens system. In some of such embodiments, the apparent size of the image source may be adjusted by adjusting the focal length, illumination depth region, and illumination envelope region of the lens system.
In some embodiments, relaying the illumination light to the image environment at 1410 may include, at 1414, relaying collimated light to the image environment. For example, as described above, light from each laser of an array of surface-emitting lasers may be collimated, and then directed in a different direction than collimated light from other lasers in the array. As another example, a microlens array may be used to relay the light received from a suitable homogenizing light guide to different portions of the illumination envelope region.
In some embodiments, the methods and processes described above may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
TOF depth camera 100 shown in
TOF depth camera 100 includes a logic subsystem 160 and a storage subsystem 162. TOF depth camera 100 may optionally include a display subsystem 164, input/output-device subsystem 166, and/or other components not shown in
Logic subsystem 160 includes one or more physical devices configured to execute instructions. For example, logic subsystem 160 may be configured to execute instructions that are part of one or more applications, services, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, or otherwise arrive at a desired result.
Logic subsystem 160 may include one or more processors configured to execute software instructions. Additionally or alternatively, logic subsystem 160 may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The processors of logic subsystem 160 may be single-core or multi-core, and the programs executed thereon may be configured for sequential, parallel or distributed processing. Logic subsystem 160 may optionally include individual components that are distributed among two or more devices, which can be remotely located and/or configured for coordinated processing. Aspects of the logic subsystem may be virtualized and executed by remotely accessible networked computing devices configured in a cloud-computing configuration.
Storage subsystem 162 includes one or more physical, non-transitory devices configured to hold data and/or instructions executable by logic subsystem 160 to implement the herein-described methods and processes. When such methods and processes are implemented, the state of storage subsystem 162 may be transformed—e.g., to hold different data.
Storage subsystem 162 may include removable media and/or built-in devices. Storage subsystem 162 may include optical memory devices (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory devices (e.g., RAM, EPROM, EEPROM, etc.) and/or magnetic memory devices (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), among others. Storage subsystem 162 may include volatile, nonvolatile, dynamic, static, read/write, read-only, random-access, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. In some embodiments, logic subsystem 160 and storage subsystem 162 may be integrated into one or more unitary devices, such as an application-specific integrated circuit (ASIC), or a system-on-a-chip.
It will be appreciated that storage subsystem 162 includes one or more physical, non-transitory devices. However, in some embodiments, aspects of the instructions described herein may be propagated in a transitory fashion by a pure signal (e.g., an electromagnetic signal, an optical signal, etc.) that is not held by a physical device for a finite duration. Furthermore, data and/or other forms of information pertaining to the present disclosure may be propagated by a pure signal.
The terms “module” and “program” may be used to describe an aspect of the computing system implemented to perform a particular function. In some cases, a module or program may be instantiated via logic subsystem 160 executing instructions held by storage subsystem 162. It will be understood that different modules and/or programs may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module and/or program may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module” and “program” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
When included, display subsystem 164 may be used to present a visual representation of data held by storage subsystem 162. This visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the storage subsystem, and thus transform the state of the storage subsystem, the state of display subsystem 164 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 164 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic subsystem 160 and/or storage subsystem 162 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input/output-device subsystem 166 may be configured to communicatively couple the computing system with one or more other computing devices. Input/output-device subsystem 166 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, input/output-device subsystem 166 may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network. In some embodiments, input/output-device subsystem 166 may allow the computing system to send and/or receive messages to and/or from other devices via a network such as the Internet. Input/output-device subsystem 166 may also optionally include or interface with one or more user-input devices such as a keyboard, mouse, game controller, camera, microphone, and/or touch screen, for example.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
Inventors: Hudman, Joshua Mark; Masalkar, Prafulla