This disclosure describes techniques for receiving information that is wirelessly transmitted to a mobile computing system by wireless devices that are proximal to a route being travelled by the mobile computing system, and presenting at least a portion of the received information through a display of the mobile computing system. The information can be displayed according to a computing experience that is determined for a user of the mobile computing system (e.g., user-selected, inferred based on a stored schedule of the user, and so forth). Different sets of location-based information can be transmitted to the mobile computing system from different wireless devices that the mobile computing system comes into proximity with while traveling along a route. In some instances, the information can be locally stored on the wireless device(s) to reduce latency.

Patent: 11,856,479
Priority: Jul 03 2018
Filed: Aug 25 2022
Issued: Dec 26 2023
Expiry: Jul 03 2039
Assignee: Magic Leap, Inc.
Entity: Large
Status: Active
1. A method performed by a mobile computing system, the method comprising:
determining a computing experience for a user of the mobile computing system, wherein determining the computing experience includes:
accessing at least one data source that stores a schedule of the user; and
inferring the computing experience based at least partly on the schedule of the user;
establishing a wireless connection between the mobile computing system and each of a plurality of wireless devices, wherein each respective wireless device is in proximity to a travel route of the mobile computing system;
scanning a visible marker that is proximal to a location along the travel route;
receiving, over each respective wireless connection between the mobile computing system and a respective wireless device, location-based information that is associated with a location of the respective wireless device in proximity to the travel route, wherein at least a portion of the location-based information that is associated with the location of the respective wireless device is communicated to the mobile computing system responsive to the mobile computing system scanning the visible marker that is proximal to the location; and
presenting, through a display of the mobile computing system, the location-based information received from each of the plurality of wireless devices, wherein the location-based information is presented according to the determined computing experience.
10. A mobile computing system comprising:
a display;
at least one processor communicatively coupled to the display; and
memory communicatively coupled to the at least one processor, the memory storing instructions which, when executed by the at least one processor, cause the at least one processor to perform operations comprising:
determining a computing experience for a user of the mobile computing system, wherein determining the computing experience includes:
accessing at least one data source that stores a schedule of the user; and
inferring the computing experience based at least partly on the schedule of the user;
establishing a wireless connection between the mobile computing system and each of a plurality of wireless devices, wherein each respective wireless device is in proximity to a travel route of the mobile computing system;
scanning a visible marker that is proximal to a location along the travel route;
receiving, over each respective wireless connection between the mobile computing system and a respective wireless device, location-based information that is associated with a location of the respective wireless device in proximity to the travel route, wherein at least a portion of the location-based information that is associated with the location of the respective wireless device is communicated to the mobile computing system responsive to the mobile computing system scanning the visible marker that is proximal to the location; and
presenting, through the display, the location-based information received from each of the plurality of wireless devices, wherein the location-based information is presented according to the determined computing experience.
2. The method of claim 1, wherein determining the computing experience includes receiving a selection of the computing experience that is made by the user through the mobile computing system.
3. The method of claim 1, wherein:
at least two of the plurality of wireless devices are at different locations in proximity to the travel route; and
the respective location-based information received from each of the at least two wireless devices is presented, through the display, during different periods of time.
4. The method of claim 1, wherein:
the computing experience includes at least one regulation that indicates at least: i) a first type of content to be displayed while the computing experience is employed, and ii) a second type of content to not be displayed while the computing experience is employed; and
presenting the location-based information includes presenting the first type of content and not presenting the second type of content.
5. The method of claim 1, wherein at least a portion of the location-based information that is associated with the location of the respective wireless device is stored locally on the respective wireless device.
6. The method of claim 1, wherein the mobile computing system is a wearable computing device.
7. The method of claim 1, wherein the visible markers are different between all the locations so as to differentiate the locations from one another.
8. The method of claim 1, wherein at least the portion of the location-based information is communicated to the mobile computing system from a cloud-based storage system.
9. The method of claim 1, wherein:
the display of the mobile computing system is an augmented reality display or a mixed reality display;
the location-based information sent from the respective wireless device includes geometric information associated with a feature in proximity to the location of the respective wireless device; and
presenting the location-based information includes using the geometric information to present a virtual representation of the feature in the display.
11. The system of claim 10, wherein determining the computing experience includes receiving a selection of the computing experience that is made by the user through the mobile computing system.
12. The system of claim 10, wherein:
at least two of the plurality of wireless devices are at different locations in proximity to the travel route; and
the respective location-based information received from each of the at least two wireless devices is presented, through the display, during different periods of time.
13. The system of claim 10, wherein:
the computing experience includes at least one regulation that indicates at least: i) a first type of content to be displayed while the computing experience is employed, and ii) a second type of content to not be displayed while the computing experience is employed; and
presenting the location-based information includes presenting the first type of content and not presenting the second type of content.
14. The system of claim 10, wherein at least a portion of the location-based information that is associated with the location of the respective wireless device is stored locally on the respective wireless device.
15. The system of claim 10, wherein the mobile computing system is a wearable computing device.
16. The system of claim 10, wherein at least a portion of the location-based information that is associated with the location of the respective wireless device is communicated to the mobile computing system responsive to the mobile computing system scanning a marker that is proximal to the location.
17. The system of claim 16, wherein at least the portion of the location-based information is communicated to the mobile computing system from a cloud-based storage system.
18. The system of claim 10, wherein:
the display of the mobile computing system is an augmented reality display or a mixed reality display;
the location-based information sent from the respective wireless device includes geometric information associated with a feature in proximity to the location of the respective wireless device; and
presenting the location-based information includes using the geometric information to present a virtual representation of the feature in the display.

The present disclosure is a continuation of U.S. patent application Ser. No. 17/257,814, filed on Jan. 4, 2021, which is a national phase of International Patent Application No. PCT/US2019/040544, filed on Jul. 3, 2019, which claims the benefit of priority under 35 U.S.C. § 119(e) of U.S. Provisional Patent Application Ser. No. 62/693,891, titled “Systems and Methods for Virtual and Augmented Reality,” which was filed on Jul. 3, 2018. The entire contents of these priority documents are incorporated by reference into the present disclosure.

This invention relates to connected mobile computing systems, methods, and configurations, and more specifically to mobile computing systems, methods, and configurations featuring at least one wearable component which may be utilized for virtual and/or augmented reality operation.

It is desirable that mixed reality, or augmented reality, near-eye displays be lightweight and low-cost, have a small form factor and a wide virtual image field of view, and be as transparent as possible. In addition, it is desirable to have configurations that present virtual image information in multiple focal planes (for example, two or more) in order to be practical for a wide variety of use cases without exceeding an acceptable allowance for vergence-accommodation mismatch. Referring to FIG. 1, an augmented reality system is illustrated featuring a head-worn viewing component (2), a hand-held controller component (4), and an interconnected auxiliary computing or controller component (6) which may be configured to be worn as a belt pack or the like on the user. These components may be operatively coupled (10, 12, 14, 16, 17, 18) to each other and to other connected resources (8), such as cloud computing or cloud storage resources, via wired or wireless communication configurations, such as those specified by IEEE 802.11, Bluetooth®, and other connectivity standards and configurations. The augmented reality system can include the two depicted optical elements (20) through which the user may see the world around them, along with visual components produced by the associated system components, for an augmented reality experience. There is a need for compact and persistently connected systems and assemblies that are optimized for use in wearable computing systems.

The present disclosure is generally directed to presenting information through a display of a mobile computing system. More specifically, the present disclosure describes, according to various embodiments, receiving information that is wirelessly transmitted to the mobile computing system by wireless devices that are proximal to a route being travelled by the mobile computing system, and presenting at least a portion of the received information through a display of the mobile computing system according to a computing experience determined for a user of the mobile computing system.

Embodiments of the present disclosure include a method performed by the mobile computing system, the method including the following operations: determining a computing experience for a user of the mobile computing system; establishing a wireless connection between the mobile computing system and each of a plurality of wireless devices, wherein each respective wireless device is in proximity to a travel route of the mobile computing system; receiving, over each respective wireless connection between the mobile computing system and a respective wireless device, location-based information that is associated with a location of the respective wireless device in proximity to the travel route; and presenting, through a display of the mobile computing system, the location-based information received from each of the plurality of wireless devices, wherein the location-based information is presented according to the determined computing experience.

Embodiments of the present disclosure can also optionally include one or more of the following aspects: determining the computing experience includes receiving a selection of the computing experience that is made by the user through the mobile computing system; determining the computing experience includes accessing at least one data source that stores a schedule of the user, and inferring the computing experience based at least partly on the schedule of the user; at least two of the plurality of wireless devices are at different locations in proximity to the travel route; the respective location-based information received from each of the at least two wireless devices is presented, through the display, during different periods of time; the computing experience includes at least one regulation that indicates at least: i) a first type of content to be displayed while the computing experience is employed, and ii) a second type of content to not be displayed while the computing experience is employed; presenting the location-based information includes presenting the first type of content and not presenting the second type of content; at least a portion of the location-based information that is associated with the location of the respective wireless device is stored locally on the respective wireless device; the mobile computing system is a wearable computing device; at least a portion of the location-based information that is associated with the location of the respective wireless device is communicated to the mobile computing system responsive to the mobile computing system scanning a beacon that is proximal to the location; at least the portion of the location-based information is communicated to the mobile computing system from a cloud-based storage system; the display of the mobile computing system is an augmented reality display or a mixed reality display; the location-based information sent from the respective wireless device includes geometric information associated with a feature in proximity to the location of the respective wireless device; and/or presenting the location-based information includes using the geometric information to present a virtual representation of the feature in the display.
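By way of a non-authoritative illustration of the claimed flow (not part of the original disclosure), the following minimal Python sketch models the four recited operations with hypothetical names such as WirelessDevice, Experience, and run_route: determine a computing experience, connect to each proximal wireless device, receive its location-based information, and present only what the determined experience permits.

```python
# Illustrative sketch only; names and data are hypothetical, not the patented implementation.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class WirelessDevice:
    device_id: str
    location: str
    payload: Dict[str, str] = field(default_factory=dict)  # location-based information


@dataclass
class Experience:
    name: str
    allowed_content: frozenset


def run_route(user_schedule: List[str], devices_on_route: List[WirelessDevice]) -> None:
    # 1. Determine the computing experience; here it is trivially inferred
    #    from the stored schedule (commute vs. leisure).
    experience = Experience(
        name="commute" if "work" in user_schedule else "leisure",
        allowed_content=frozenset({"navigation", "audio_news"}),
    )
    for device in devices_on_route:
        # 2. Establish a wireless connection with the proximal device (stubbed out).
        # 3. Receive the device's location-based information over that connection.
        info = device.payload
        # 4. Present only the content types the determined experience permits.
        for content_type, content in info.items():
            if content_type in experience.allowed_content:
                print(f"[{device.location}] {content_type}: {content}")


run_route(
    ["home", "work"],
    [WirelessDevice("dev-44", "point C",
                    {"navigation": "trench hazard ahead", "shopping": "sale nearby"})],
)
```

Running the sketch presents only the navigation item from the device near point C and suppresses the shopping item, mirroring how the determined experience governs presentation.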

FIG. 1 shows an example computing system.

FIGS. 2-5 show example environments in which a mobile computing system can operate according to embodiments of the present disclosure.

FIGS. 6-8 show schematics of example wireless devices according to embodiments of the present disclosure.

FIG. 9 shows an example marker which can be employed according to embodiments of the present disclosure.

Referring to FIG. 2, a travelling scenario (160) is depicted wherein a user of a mobile computing system, such as the wearable computing system described in reference to FIG. 1, operates in the world. FIG. 2 illustrates a home (22) of the user which features at least one wireless device (40) configured to connect to the user's wearable computing system. In an illustrative example day, the user travels (30) from home (22; point A—80) to work (24; points B—82, C—84, D—86, E—88); he then travels (32; points I—96, J—98) from work (24) to a park (26) for a walk (28; points K—100, L—102, M—104) before completing the return (34; points N—106, O—108) to home (22). Along the way, his mobile computing system comes into wireless contact with various wireless devices (40, 42, 44, 46, 48, 50, 52, 54, and others shown in the magnified views of FIG. 3 and FIG. 4). Preferably the mobile computing system is configured to utilize these wireless devices, and the information exchanged with them, to provide the user with a relatively low-latency and robust connectivity experience, generally subject to preferences selected by the user.

In some embodiments, the mobile computing system can be an augmented reality, or mixed reality, system as described, for example, in U.S. patent application Ser. Nos. 14/555,585, 14/690,401, 14/331,218, 15/481,255, and 62/518,539, each of which is incorporated by reference herein in its entirety.

In one embodiment, the mobile computing system may be configured such that the user selects certain aspects of his computing experience for the day. For example, through a graphical user interface, voice controls, and/or gestures, the user may input to the mobile computing system that he will have a typical work day, taking the usual route there and stopping at the park for a brief walk on the way home. Preferably the mobile computing system has certain artificial intelligence aspects so that it may use integration with his electronic calendar to provisionally understand his schedule, subject to quick confirmations. For example, as he is departing for work, the system may be configured to say or show: “headed to work; usual route and usual computing configuration”, where the usual route may be garnered from previous GPS and/or mobile triangulation data gathered through his mobile computing system. The “usual computing configuration” may be customized by the user and subject to regulations; for example, while the user is driving, the system may be configured to present only certain non-occlusive visuals (no advertisements and no shopping or other information not pertinent to driving) and to provide an audio version of a news program or a current favorite audiobook on the way to work. As the user drives to work, he may leave connectivity with his home wireless device (40) and enter or maintain connectivity with other wireless devices (42, 44, 46, 48). Each of these wireless devices may be configured to provide the user's mobile computing system with information pertinent to the user's experience at relatively low latency (e.g., by locally storing certain information likely to be pertinent to a user at that location). FIGS. 6 and 7 illustrate certain aspects of wireless devices which may be utilized as described herein; the embodiments of FIGS. 8 and 9 feature non-storage beacon and/or marker configurations which also may be utilized to connect directly to locally-pertinent cloud-based information, without the benefit of local storage.
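As a hedged illustration of the “usual computing configuration” and its regulations described above (the names Regulation and DRIVING_EXPERIENCE are hypothetical and not taken from the disclosure), a computing experience might be represented as a pair of content-type sets: types to display while the experience is employed and types to suppress.

```python
# Illustrative sketch of experience regulations; names and categories are assumptions.
from dataclasses import dataclass
from typing import Iterable, List, Tuple


@dataclass(frozen=True)
class Regulation:
    display_types: frozenset    # first type of content: shown while the experience is employed
    suppress_types: frozenset   # second type of content: never shown while the experience is employed


DRIVING_EXPERIENCE = Regulation(
    display_types=frozenset({"non_occlusive_visual", "audio_news", "audiobook"}),
    suppress_types=frozenset({"advertisement", "shopping"}),
)


def filter_for_experience(items: Iterable[Tuple[str, str]],
                          regulation: Regulation) -> List[Tuple[str, str]]:
    """Keep items whose content type is permitted; drop suppressed types."""
    return [(ctype, body) for ctype, body in items
            if ctype in regulation.display_types and ctype not in regulation.suppress_types]


incoming = [("audio_news", "morning headlines"),
            ("advertisement", "coffee shop ahead"),
            ("non_occlusive_visual", "lane closure overlay")]
print(filter_for_experience(incoming, DRIVING_EXPERIENCE))
```

Applied to the incoming items, only the audio news and the non-occlusive lane-closure overlay survive; the advertisement is dropped, matching the driving regulation described above.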

For example, as the user travels from point A (80) to point B (82) to point C (84), a local wireless device (44) around point C (84) may be configured to pass to the user's mobile computing system geometric information that may be utilized to highlight where a trench is being created at that location, so that the user clearly visualizes and/or understands the hazard while driving past. This geometric information (which may feature a highlighted outline of the trench, for example, and may also feature one or more photos or other non-geometric information) may be stored locally on the local wireless device (44) so that it does not need to be pulled from more remote resources, which may involve greater latency in getting the information to the driver. In addition to lowering latency, local storage also may function to decrease the overall compute load on the user's mobile computing system, because the mobile system may receive information that it otherwise would have had to generate or build itself based upon its own sensors, which may comprise part of the local mobile hardware.
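A minimal sketch of the local-first retrieval idea above, assuming hypothetical helpers fetch_from_device and fetch_from_cloud (the latencies are simulated placeholders, not measured figures):

```python
# Illustrative local-first lookup; keys, payloads, and delays are assumptions.
import time
from typing import Optional


LOCAL_CACHE = {"trench_outline@pointC": {"vertices": [(0, 0), (4, 0), (4, 1), (0, 1)],
                                         "note": "open trench, keep right"}}


def fetch_from_device(key: str) -> Optional[dict]:
    """Simulated low-latency read from the wireless device's local storage."""
    time.sleep(0.005)          # ~5 ms over the local link
    return LOCAL_CACHE.get(key)


def fetch_from_cloud(key: str) -> dict:
    """Simulated higher-latency read from remote (cloud) storage."""
    time.sleep(0.150)          # ~150 ms round trip
    return {"vertices": [], "note": f"cloud copy of {key}"}


def get_location_info(key: str) -> dict:
    # Prefer the proximal device's cached copy; fall back to the cloud only when absent.
    return fetch_from_device(key) or fetch_from_cloud(key)


print(get_location_info("trench_outline@pointC"))
```

Because the trench outline is present in the device's local cache, the slower cloud path is never exercised; latency savings of this kind are the stated motivation for local storage.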

Once the user arrives at the parking lot of his work (24), the system may, for example, be configured to detect walking velocity and, as configured by the user, to review the user's schedule for the day, via an integration with his computerized calendaring system, as he walks up to the office. Certain additional information not resident on his local mobile computing system may be pulled from local sources (48, 50, for example) which may feature certain storage capacity, again to facilitate a smaller mobile overhead and lower latency versus direct cloud connectivity.

Referring to FIG. 4, once in the office (24), the user may connect with a variety of wireless devices (50, 60, 62, 64, 66, 68, 70, 72, 74), each of which may be configured to provide location-based information. For example, when at point F (90), the user's mobile computing system may be configured to detect the location (such as by GPS, computer vision, marker or beacon identification, and/or wireless device (60, 62, 64) triangulation) and then quickly upload from local storage (e.g., from a wireless device 60, 62, 64) to his mobile computing system information pertinent to that location, such as a dense triangular mesh of the geometry of the room, information pertaining to whose office that room is, information about that person, or other information that may be deemed relevant, such as by an artificial intelligence agent working automatically on the user's mobile computing system. Various other wireless devices (50, 66, 68, 70, 72, 74) may be positioned in other locations of the office and configured to provide other location-based information, again giving local users low-latency and robust mobile computing functionality without everything, such as determination of the room geometry, being done de novo in real time by the sensor facilities local to the mobile computing system.

Referring to FIG. 3, similar wireless device resources (40, 56, 58) may be utilized in the home (22) to assist with location-based information as the user navigates (P—110, Q—112, R—114, S—116, T—118, U—120) the home with his mobile computing system. In the office (24) or home (22) environments, the mobile computing system may be configured to utilize external resources quite differently from when driving. For example, the artificial intelligence component of the user's mobile computing system may be aware that the user likes to watch nightly news highlights from the previous week as he walks around on Saturday mornings between 7 and 8 am, perhaps in a display manner that would ordinarily not be acceptable when driving but is acceptable when walking, or perhaps automatically expanding when the user stops walking around and is seated or standing still. When walking velocity is detected between those hours, the system may be configured to deliver such highlights from local storage, while also gathering other location-based information pertinent to that location, such as the positions of various objects or structures within the house (e.g., to decrease the computer vision processing load).
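One illustrative way (an assumption for exposition, not the disclosure's prescribed method) to combine the wireless-device triangulation mentioned above with the choice of which device's cached payload to request is a signal-strength-weighted position estimate:

```python
# Illustrative RSSI-weighted location estimate; device positions and readings are assumptions.
from typing import Dict, Tuple

DEVICE_POSITIONS = {"dev-60": (0.0, 0.0), "dev-62": (6.0, 0.0), "dev-64": (3.0, 4.0)}


def weighted_centroid(rssi_dbm: Dict[str, float]) -> Tuple[float, float]:
    """Crude position estimate: weight each device position by received power."""
    weights = {d: 10 ** (r / 10.0) for d, r in rssi_dbm.items()}   # dBm -> mW
    total = sum(weights.values())
    x = sum(DEVICE_POSITIONS[d][0] * w for d, w in weights.items()) / total
    y = sum(DEVICE_POSITIONS[d][1] * w for d, w in weights.items()) / total
    return x, y


def nearest_device(position: Tuple[float, float]) -> str:
    """Pick the device whose cached payload (e.g., room mesh) to request."""
    return min(DEVICE_POSITIONS,
               key=lambda d: (DEVICE_POSITIONS[d][0] - position[0]) ** 2 +
                             (DEVICE_POSITIONS[d][1] - position[1]) ** 2)


pos = weighted_centroid({"dev-60": -48.0, "dev-62": -71.0, "dev-64": -60.0})
print(pos, nearest_device(pos))
```

With the example signal strengths, the estimate lands nearest the first device, so its locally stored room mesh would be the one requested.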

Similarly, as the user navigates a walk (28) through the park (26), shown in magnified view in FIG. 5, local wireless device resources (54) may be utilized to provide location-based information, such as background information related to a sculpture garden that the user may be observing as he walks along; such information may be displayed or reproduced as audio in a manner that is tailored and/or customizable to his walking-in-a-park scenario (e.g., as opposed to driving, or walking around at home or at work).

Referring to FIG. 6, in one embodiment, one or more of the aforementioned wireless devices (40, 42, 44, 46, 48, 50, 52, 54, and others as shown in the magnified views of FIG. 3 and FIG. 4) may comprise a system as shown in FIG. 6, wherein a local controller (134), such as a processor, is operatively coupled (138) to a power supply (132), such as a battery, a transceiver (130), such as a transmitting and receiving antenna set, and a local storage device (136), such as a mass storage or memory device. The storage device (136) may be operatively coupled (140) to external storage resources (146), such as cloud storage resources; the local power supply (132) may be operatively coupled (142) to external power resources (148), such as for long-term charging or replenishment; and the transceiver (130) may be operatively coupled to external connectivity resources (150) to provide access, for example, to the internet backbone. All of these local and connected resources may be configured, based upon the location of such a device, to provide local users with information tailored to the local scenario, whether such information is pertinent to traffic, shopping, weather, structures, culture, etc.

FIG. 7 illustrates an embodiment similar to that of FIG. 6 but without a local storage facility; its components are operatively coupled (141) to remote storage resources (146), such as cloud resources. Such an embodiment may be utilized in various configurations in place of embodiments such as those of FIG. 6, without the benefit of directly local storage (as described above, such local storage may be beneficial in reducing the latency of providing information to a mobile system in the area).

Referring to FIG. 8, in further scenarios without local storage capability, a transmitter beacon (41) type of device, for example featuring only a transmitter (131, not a two-way transceiver) and a relatively long-term battery (132), may be utilized to connect to a locally positioned mobile computing device and to share location or beacon identification information that functions as a pointer connecting the mobile computing system with pertinent cloud resources (e.g., bypassing local storage, but providing information akin to: “you are here”, plus pointers to pertinent cloud resources).

Referring to FIG. 9, in a very basic scenario, a non-electronic marker (43), such as an ArUco marker, may be utilized to function as a similar pointer connecting the mobile computing system with pertinent cloud resources, again bypassing local storage.
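The four configurations of FIGS. 6-9 can be summarized, purely as an explanatory sketch with hypothetical names (LocationSource, resolve, CLOUD), as a single resolution rule: use local content when the source stores it, otherwise follow the source's pointer into cloud storage.

```python
# Illustrative data model for the FIG. 6-9 configurations; names and records are assumptions.
from dataclasses import dataclass
from typing import Optional


CLOUD = {"loc/park-sculpture": "background on the sculpture garden"}


@dataclass
class LocationSource:
    kind: str                       # "device", "beacon", or "marker"
    local_content: Optional[str]    # present only when the source has local storage (FIG. 6)
    cloud_pointer: Optional[str]    # "you are here" plus a pointer to cloud resources (FIGS. 7-9)


def resolve(source: LocationSource) -> str:
    if source.local_content is not None:       # FIG. 6: local storage, lowest latency
        return source.local_content
    if source.cloud_pointer is not None:       # FIGS. 7-9: resolve the pointer against the cloud
        return CLOUD.get(source.cloud_pointer, "no cloud record")
    return "no information available"


print(resolve(LocationSource("marker", None, "loc/park-sculpture")))
```

A printed marker carries no content of its own, so the sketch resolves it through the cloud pointer, akin to the “you are here plus pointers to cloud resources” behavior described above.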

As described above, to decrease latency and generally increase useful access to pertinent location-based information, wireless devices with localized storage resources, such as those depicted in FIG. 6, may be located throughout the interiors of structures such as homes, enterprises, etc., and also in exterior locations, such as urban downtown areas and the outsides of stores or shops. Similarly, wireless devices without localized storage capacity, but operatively coupled to, or pointing to, remote storage resources, also may be located throughout such interior and exterior locations.

In one embodiment, the mobile computing system may be customizable by the user to present information filtered on a time-domain basis, such as by how old or “stale” the information is. For example, the user may be able to configure the system to provide, while he is driving, only traffic information that is 10 minutes old or newer, or to present only architectural information (e.g., positions of walls within a building) that is one year old or newer; in each case the time-domain aspect may be customized and configurable.
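A minimal sketch of the configurable time-domain filter described above, with illustrative category limits (10 minutes for traffic, one year for architectural information); the names MAX_AGE and fresh_only are hypothetical:

```python
# Illustrative staleness filter; categories, limits, and items are assumptions.
from datetime import datetime, timedelta
from typing import Dict, List, Tuple

MAX_AGE = {"traffic": timedelta(minutes=10),
           "architecture": timedelta(days=365)}


def fresh_only(items: List[Tuple[str, datetime, str]],
               now: datetime,
               limits: Dict[str, timedelta]) -> List[Tuple[str, datetime, str]]:
    """Keep (category, timestamp, body) tuples younger than the category's configured limit."""
    return [(cat, ts, body) for cat, ts, body in items
            if now - ts <= limits.get(cat, timedelta.max)]


now = datetime(2019, 7, 3, 8, 0)
items = [("traffic", now - timedelta(minutes=4), "slowdown at point C"),
         ("traffic", now - timedelta(minutes=45), "stale congestion report"),
         ("architecture", now - timedelta(days=90), "updated wall positions")]
print(fresh_only(items, now, MAX_AGE))
```

The 45-minute-old traffic report is filtered out while the 90-day-old architectural update is retained, reflecting the per-category staleness limits the user configured.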

Various example embodiments of the invention are described herein. Reference is made to these examples in a non-limiting sense; they are provided to illustrate more broadly applicable aspects of the invention. Various changes may be made to the invention described, and equivalents may be substituted, without departing from the true spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation, material, composition of matter, process, process act(s), or step(s) to the objective(s), spirit, or scope of the present invention. Further, as will be appreciated by those with skill in the art, each of the individual variations described and illustrated herein has discrete components and features which may be readily separated from or combined with the features of any of the other several embodiments without departing from the scope or spirit of the present inventions. All such modifications are intended to be within the scope of claims associated with this disclosure.

The invention includes methods that may be performed using the subject devices. The methods may comprise the act of providing such a suitable device. Such provision may be performed by the end user. In other words, the “providing” act merely requires the end user obtain, access, approach, position, set-up, activate, power-up or otherwise act to provide the requisite device in the subject method. Methods recited herein may be carried out in any order of the recited events which is logically possible, as well as in the recited order of events.

Example aspects of the invention, together with details regarding material selection and manufacture, have been set forth above. As for other details of the present invention, these may be appreciated in connection with the above-referenced patents and publications, as well as being generally known or appreciated by those with skill in the art. The same may hold true with respect to method-based aspects of the invention in terms of additional acts as commonly or logically employed.

In addition, though the invention has been described in reference to several examples optionally incorporating various features, the invention is not to be limited to that which is described or indicated as contemplated with respect to each variation of the invention. Various changes may be made to the invention described and equivalents (whether recited herein or not included for the sake of some brevity) may be substituted without departing from the true spirit and scope of the invention. In addition, where a range of values is provided, it is understood that every intervening value, between the upper and lower limit of that range and any other stated or intervening value in that stated range, is encompassed within the invention.

Also, it is contemplated that any optional feature of the inventive variations described may be set forth and claimed independently, or in combination with any one or more of the features described herein. Reference to a singular item includes the possibility that there are plural of the same items present. More specifically, as used herein and in claims associated hereto, the singular forms “a,” “an,” “said,” and “the” include plural referents unless specifically stated otherwise. In other words, use of these articles allows for “at least one” of the subject item in the description above as well as in claims associated with this disclosure. It is further noted that such claims may be drafted to exclude any optional element. As such, this statement is intended to serve as antecedent basis for use of such exclusive terminology as “solely,” “only,” and the like in connection with the recitation of claim elements, or use of a “negative” limitation.

Without the use of such exclusive terminology, the term “comprising” in claims associated with this disclosure shall allow for the inclusion of any additional element—irrespective of whether a given number of elements are enumerated in such claims, or the addition of a feature could be regarded as transforming the nature of an element set forth in such claims. Except as specifically defined herein, all technical and scientific terms used herein are to be given as broad a commonly understood meaning as possible while maintaining claim validity.

The breadth of the present invention is not to be limited to the examples provided and/or the subject specification, but rather only by the scope of claim language associated with this disclosure.

Inventor: Lundmark, David Charles
