A new tracking technique is essentially "sourceless" in that it can be used anywhere with no set-up, yet it enables a much wider range of virtual environment-style navigation and interaction techniques than does a simple head-orientation tracker. A sourceless head orientation tracker is combined with a head-worn tracking device that tracks a hand-mounted 3D beacon relative to the head. The system encourages use of intuitive interaction techniques which exploit proprioception.

Patent: US 6,757,068
Priority: Jan. 28, 2000
Filed: Jan. 26, 2001
Issued: Jun. 29, 2004
Expiry: May 13, 2022
Extension: 472 days
Entity: Large
1. A method comprising:
mounting a sourceless orientation tracker on a user's head, and
using a position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head.
2. The method of claim 1 in which the first localized feature associated with the limb comprises a point on a hand-held object or a point on a hand-mounted object or a point on a hand.
3. The method of claim 2, wherein the first localized feature is on a stylus-shaped device.
4. The method of claim 2, wherein the first localized feature is on a ring.
5. The method of claim 1 further comprising using the position tracker to determine a distance between the first localized feature and a second localized feature associated with the user's head.
6. The method of claim 1 in which the position tracker comprises an acoustic position tracker.
7. The method of claim 1 in which the position tracker comprises an electro-optical system that tracks LEDs, optical sensors or reflective marks.
8. The method of claim 1 in which the position tracker comprises a video machine-vision device that recognizes the first localized feature.
9. The method of claim 1 in which the position tracker comprises a magnetic tracker with a magnetic source held in the hand and sensors integrated in the headset or vice versa.
10. The method of claim 1 in which the position tracker comprises a radio frequency position locating device.
11. The method of claim 1 in which the sourceless orientation tracker comprises an inertial sensor.
12. The method of claim 1 in which the sourceless orientation tracker comprises a tilt-sensor.
13. The method of claim 1 in which the sourceless orientation tracker comprises a magnetic compass sensor.
14. The method of claim 1 further comprising:
mounting a display device on the user's head; and
displaying a first object at a first position on the display device.
15. The method of claim 14 further comprising:
changing the orientation of the display device; and
after changing the orientation of the display device, redisplaying the first object at a second position on the display device based on the change in orientation.
16. The method of claim 15, wherein the second position is determined so as to make the position of the first object appear to be fixed relative to a first coordinate reference frame, which frame does not rotate with the display device during said changing of the orientation of the display device.
17. The method of claim 16, wherein the first object is displayed in response to a signal from a computer.
18. The method of claim 17, further comprising:
mounting a wearable computer on the user's body, and wherein the first object is displayed in response to a signal from the wearable computer.
19. The method of claim 15, further comprising displaying a portion of a virtual environment on the display device.
20. The method of claim 19, further comprising:
displaying a portion of the virtual environment on the display device before changing the orientation of the display device, and displaying a different portion of the virtual environment on the display device after changing the orientation of the display device.
21. The method of claim 19, in which the virtual environment is a fly-through virtual environment.
22. The method of claim 19, in which the virtual environment includes a virtual treadmill.
23. The method of claim 15, further comprising displaying a graphical user interface for a computer on the display device.
24. The method of claim 23, wherein the first object is a window, icon or menu in the graphical user interface.
25. The method of claim 23, wherein the first object is a pointer for the graphical user interface.
26. The method of claim 16, further comprising:
changing the position of the first localized feature relative to the position tracker; and
after changing the position of the first localized feature, redisplaying the first object at a second position on the display device determined based on the change in the position of the first localized feature.
27. The method of claim 26, further comprising:
displaying a second object on the display device, wherein
after changing the position of the first localized feature, the displayed position of the second object on the display device does not change in response to the change in the position of the first localized feature.
28. The method of claim 26, wherein the second position is determined so as to make the position of the first object appear to coincide with the position of the first localized feature as seen or felt by the user.
29. The method of claim 17, further comprising:
changing the orientation of the first coordinate reference frame in response to a signal being received by the computer.
30. The method of claim 29, wherein the orientation of the first coordinate reference frame is changed in response to a change in the position of the first localized feature.
31. The method of claim 29, wherein the orientation of the first coordinate reference frame is changed in response to a signal representative of the location of the user.
32. The method of claim 29, wherein the orientation of the first coordinate reference frame is changed in response to a signal representative of a destination.
33. The method of claim 29, wherein the orientation of the first coordinate reference frame is changed in response to a signal representative of a change in the user's immediate surroundings.
34. The method of claim 29, wherein the orientation of the first coordinate reference frame is changed in response to a signal representative of a change in the physiological state or physical state of the user.
35. The method of claim 27, wherein redisplaying the first object further comprises changing the apparent size of the first object according to the change in position of the first localized feature.
36. The method of claim 1, further comprising:
mounting a portable beacon, transponder or passive marker at a fixed point in the environment; and
determining the position vector of a second localized feature associated with the user's head relative to the fixed point.
37. The method of claim 36, further comprising determining the position vector of the first localized feature relative to the fixed point.
38. The method of claim 36, wherein the position vector is determined without determining the distance between the second localized feature and more than one fixed point in the environment.
39. The method of claim 36, wherein the position vector is determined without determining the distance between the second localized feature and more than two fixed points in the environment.
40. The method of claim 36, further comprising:
mounting a sourceless orientation tracker on a second user's head; and
determining the position of a localized feature associated with the body of the second user relative to the fixed point.
41. The method of claim 16, further comprising:
displaying the first object at a third position;
after displaying the first object at the third position, changing the orientation of the display; and
after changing the orientation of the display, continuing to display the first object at the third position.
42. The method of claim 27, wherein the first object is a window in a wraparound computer interface.
43. The method of claim 26, wherein said changed position of the first localized feature is not within the field of view of the display when the first object is redisplayed.
44. The method of claim 43, further comprising:
displaying the first object at an apparent position coinciding with the position of the first localized object when the first localized object is within the field of view of the display.
45. The method of claim 1, further comprising:
positioning the first localized feature at a first point;
positioning the first localized feature at a second point; and
calculating the distance between the first point and the second point.
46. The method of claim 1, further comprising:
determining a position vector of the first localized feature relative to a second localized feature associated with the user's head; and
transforming the position vector based on an orientation of the user's head.
47. The method of claim 46, further comprising:
setting an assumed position for the user's head in a coordinate system; and
setting a position for the first localized feature in the coordinate system based on the assumed position of the user's head and said position vector.
48. The method of claim 47, where setting a position for the first localized feature further comprises:
measuring the orientation of the user's head relative to a fixed frame of reference.
49. The method of claim 47, further comprising:
setting a virtual travel speed and direction for the user; and
modifying the assumed position for the user's head based on the user's virtual travel speed and direction.
50. The method of claim 1, wherein the sourceless orientation tracker comprises a first inertial sensor, and further comprising:
mounting a second inertial sensor elsewhere on the user's body or in an object held by the user; and
tracking the position of one inertial sensor relative to the other.
51. The method of claim 14, further comprising:
mounting a video camera on the back of the user's head; and
displaying an image generated by the video camera in a portion of the display device.
52. A method comprising:
using acoustic or radio frequency signals to track a position of a first localized feature associated with a limb of the user relative to the user's head.
53. A tracking system comprising:
an acoustic or radio frequency position tracker adapted for mounting on a user's head,
said tracker being adapted to track a position of a first localized feature associated with a limb of the user relative to the user's head.
54. A tracking system comprising:
a sourceless orientation tracker for mounting on a user's head, and
a position tracker adapted to track a position of a first localized feature associated with a limb of the user relative to the user's head.
55. A method comprising:
mounting a motion tracker on a user's head;
using a position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head;
positioning the first localized feature at a first point;
positioning the first localized feature at a second point; and
calculating the distance between the first point and the second point.
56. A method comprising:
mounting a first inertial sensor on a user's head;
mounting a second inertial sensor elsewhere on the user's body or in an object held by the user; and
tracking the position of one inertial sensor relative to the other.
57. The method of claim 56, further comprising:
sensing data at the first and second inertial sensors and using the sensed data to track the position of one inertial sensor relative to the other.
58. The method of claim 57, wherein tracking the position of the inertial sensor is done without reference to any signal received from a source not mounted on or held by the user.
59. The method of claim 58, wherein the drift of the relative position or orientation of the second inertial sensor relative to the first inertial sensor is corrected by measurements between devices on the user's head and devices elsewhere on the user's body.

This application claims priority under 35 USC §119(e) to provisional U.S. Patent Application Ser. No. 60/178,797, filed on Jan. 28, 2000, the entire contents of which are hereby incorporated by reference.

This invention relates to self-referenced tracking.

Virtual reality (VR) systems require tracking of the orientation and position of a user's head and hands with respect to a world coordinate frame in order to control view parameters for head-mounted displays (HMDs) and allow manual interactions with the virtual world. In laboratory VR setups, this tracking has been achieved with a variety of mechanical, acoustic, magnetic, and optical systems. These systems require propagation of a signal between a fixed "source" and the tracked "sensor" and therefore limit the range of operation. They also require a degree of care in setting up the source or preparing the site that reduces their utility for field use.

The emerging fields of wearable computing and augmented reality (AR) require tracking systems to be wearable and capable of operating essentially immediately in arbitrary environments. "Sourceless" orientation trackers have been developed based on geomagnetic and/or inertial sensors. They allow enough control to look around the virtual environment and fly through it, but they don't enable the "reach-out-and-grab" interactions that make virtual environments so intuitive and which are needed to facilitate computer interaction.

In one aspect, in general, the invention provides a new tracking technique that is essentially "sourceless" in that it can be used anywhere with no set-up of a source, yet it enables a wider range of virtual environment-style navigation and interaction techniques than does a simple head-orientation tracker, including manual interaction with virtual objects. The equipment can be produced at only slightly more than the cost of a sourceless orientation tracker and can be used by novice end users without any knowledge of tracking technology, because there is nothing to set up or configure.

In another aspect, in general, the invention features mounting a tracker on a user's head and using the tracker to track a position of a localized feature associated with a limb of the user relative to the user's head. The localized feature associated with the limb may include a hand-held object or a hand-mounted object or a point on a hand.

In another aspect, in general, the invention features mounting a sourceless orientation tracker on a user's head and using a position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head.

In another aspect, in general, the invention features tracking a point on a hand-held object such as a pen or a point on a hand-mounted object such as a ring or a point on a hand relative to a user's head.

In another aspect, in general, the invention features using a position tracker to determine a distance between a first localized feature associated with a user's limb and a second localized feature associated with the user's head.

In another aspect, in general, the invention features a position tracker which includes an acoustic position tracker, an electro-optical system that tracks LEDs, optical sensors or reflective marks, a video machine-vision device, a magnetic tracker with a magnetic source held in the hand and sensors integrated in the headset or vice versa, or a radio frequency position locating device.

In another aspect, in general, the invention features a sourceless orientation tracker including an inertial sensor, a tilt-sensor, or a magnetic compass sensor.

In another aspect, in general, the invention features mounting a display device on the user's head and displaying a first object at a first position on the display device.

In another aspect, in general, the invention features changing the orientation of a display device, and, after changing the orientation of the display device, redisplaying the first object at a second position on the display device based on the change in orientation.

In another aspect, in general, the invention features determining the second position for displaying the first object so as to make the position of the first object appear to be fixed relative to a first coordinate reference frame, which frame does not rotate with the display device during said changing of the orientation of the display device.

In another aspect, in general, the invention features displaying the first object in response to a signal from a computer.

In another aspect, in general, the invention features mounting a wearable computer on the user's body, and displaying a first object in response to a signal from the wearable computer.

In another aspect, in general, the invention features displaying at least a portion of a virtual environment, such as a fly-through virtual environment, or a virtual treadmill, on the display device.

In another aspect, in general, the invention features displaying a graphical user interface for a computer on the display device.

In another aspect, in general, the invention features the first object being a window, icon or menu in the graphical user interface.

In another aspect, in general, the invention features the first object being a pointer for the graphical user interface.

In another aspect, in general, the invention features changing the position of the first localized feature relative to the position tracker and, after changing the position of the first localized feature, redisplaying the first object at a second position on the display device determined based on the change in the position of the first localized feature.

In another aspect, in general, the invention features displaying a second object on the display device, so that after changing the position of the first localized feature, the displayed position of the second object on the display device does not change in response to the change in the position of the first localized feature.

In another aspect, in general, the invention features determining the second position so as to make the position of the first object appear to coincide with the position of the first localized feature as seen or felt by the user.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal being received by the computer.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a change in the position of the first localized feature.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of the location of the user.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of a destination.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of a change in the user's immediate surroundings.

In another aspect, in general, the invention features changing the orientation of the first coordinate reference frame in response to a signal representative of a change in the physiological state or physical state of the user.

In another aspect, in general, the invention features, in redisplaying the first object, changing the apparent size of the first object according to the change in position of the first localized feature.

In another aspect, in general, the invention features mounting a portable beacon, transponder or passive marker at a fixed point in the environment and determining the position vector of a second localized feature associated with the user's head relative to the fixed point.

In another aspect, in general, the invention features determining the position vector of the first localized feature relative to the fixed point.

In another aspect, in general, the invention features mounting a sourceless orientation tracker on a second user's head and determining the position of a localized feature associated with the body of the second user relative to the fixed point.

In another aspect, in general, the invention features determining the position vector of a second localized feature associated with the user's head relative to the fixed point without determining the distance between the second localized feature and more than one fixed point in the environment.

In another aspect, in general, the invention features displaying the first object at a third position; after displaying the first object at the third position, changing the orientation of the display; and, after changing the orientation of the display, continuing to display the first object at the third position.

In another aspect, in general, the invention features the first object being a window in a wraparound computer interface.

In another aspect, in general, the invention features the changed position of the first localized feature not being within the field of view of the display when the first object is redisplayed.

In another aspect, in general, the invention features displaying the first object at a position coinciding with the position of the first localized object when the first localized object is within the field of view of the display.

In another aspect, in general, the invention features positioning the first localized feature at a first point, positioning the first localized feature at a second point, and calculating the distance between the first point and the second point.

In another aspect, in general, the invention features determining a position vector of the first localized feature relative to a second localized feature associated with the user's head and modifying the position vector based on an orientation of the user's head.

In another aspect, in general, the invention features setting an assumed position for the user's head in a coordinate system and setting a position for the first localized feature in the coordinate system based on the assumed position of the user's head and said position vector.

In another aspect, in general, the invention features measuring the orientation of the user's head relative to a fixed frame of reference.

In another aspect, in general, the invention features setting a virtual travel speed and direction for the user and modifying the assumed position for the user's head based on the user's virtual travel speed and direction.

In another aspect, in general, the invention features mounting on the head of a user a three degree of freedom orientation tracker for tracking the orientation of the head, and a three degree of freedom position tracker for tracking the position of a first localized feature on the user's limb relative to a second localized feature on the user's head, computing a position vector for the first localized feature relative to the second localized feature, determining a rotation matrix based on information received from the orientation tracker, and transforming the position vector into a position vector for a fixed frame of reference based on the rotation matrix.

In another aspect, in general, the invention features using an acoustic or radio frequency position tracker to track a position of a first localized feature associated with a limb of the user relative to the user's head.

In another aspect, in general, the invention features mounting a video camera on the back of the user's head and displaying an image generated by the video camera in a portion of a display device mounted on the user's head.

In another aspect, in general, the invention features mounting a first inertial sensor on a user's head, mounting a second inertial sensor elsewhere on the user's body or in an object held by the user, and tracking the position of one inertial sensor relative to the other.

Some embodiments of the invention include sensing data at the first and second inertial sensors and using the sensed data to track the position of one inertial sensor relative to the other, tracking the position of one inertial sensor relative to the other without reference to any signal received from a source not mounted on or held by the user, and correcting the drift of the relative position or orientation of the second inertial sensor relative to the first inertial sensor by measurements between devices on the user's head and devices elsewhere on the user's body.

Among the advantages of the invention are one or more of the following. The device is easy to don, can track both head and hand, adds no new cables to a wearable computer system, works anywhere indoors or outdoors with no preparation, and is simpler than alternatives such as vision-based self-tracking.

The details of one or more embodiments of the invention are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the invention will be apparent from the description and drawings, and from the claims.

FIG. 1 is a perspective view of a self-referenced tracking device mounted on a head.

FIG. 2 is a block diagram.

FIG. 3 is a graph of tracking coverage and relative resolution.

FIG. 4 is a view of an information cockpit.

FIG. 5 shows a user using a virtual reality game.

Like reference symbols in the various drawings indicate like elements.

As seen in FIG. 1, implementations of the invention may combine a sourceless head orientation tracker 30 with a head-worn tracking device 12 that tracks a hand-mounted 3D beacon 14 relative to the head 16. One implementation uses a wireless ultrasonic tracker 12, which has the potential for low cost, lightweight, low power, good resolution, and high update rates when tracking at the relatively close ranges typical of head-hand displacements.

As FIG. 1 illustrates, this arrangement provides a simple and easy-to-don hardware system. In a fully integrated wearable VR system using this tracker there are only three parts (a wearable computer 10, a headset 15 with an integrated tracking system, and a hand-mounted beacon 14) and one cable connection 18. This is possible because the entire ultrasonic receiver system 12 for tracking the beacon can be reduced to a few small signal-conditioning circuits and integrated with the sourceless orientation tracker 30 in the head-worn display 15. By sharing the microprocessor and its power and communications link to the wearable, the cost and complexity are reduced.

The benefits of this combination of elements stem from these realizations:

1. It is usually not important to track the hand unless it is in front of the head. Thus range and line-of-sight limitations are no problem if the tracker is mounted on the forehead.

2. The hand position measured in head space can be transformed into world space with good seen/felt position match using an assumed head pose, no matter how inaccurate.

3. Using one fixed beacon, the same tracking hardware can provide full 6-DOF tracking.

Implementations of the invention may exhibit:

1. A new tracking concept that enables immersive visualization and intuitive manual interaction using a wearable system in arbitrary unprepared environments.

2. An information cockpit metaphor for a wearable computer user interface and a set of interaction techniques based on this metaphor.

As shown in FIG. 2, a simple proof-of-concept implementation combines an InterSense IS-300 sourceless inertial orientation tracker 40 (available from InterSense, Inc., in Burlington, Mass.) with a Pegasus FreeD ultrasonic position tracker 50 (available from Pegasus Technologies Ltd. in Holon, Israel). The IS-300 has an "InertiaCube" inertial sensor assembly 42, just over an inch on a side, cabled to a small computational unit 44 that outputs orientation data through a serial port 46. The FreeD product consists of a finger-worn wireless ultrasonic emitter 50A with two mouse buttons 54, and an L-shaped receiver bar 50B which normally mounts on the frame of a computer monitor, and outputs x,y,z data through a serial port. For our experiments we mounted the InertiaCube and the L-shaped receiver bar on the visor 60 of a V-Cap 1000 see-through HMD (available from Virtual Vision of Seattle, Wash.). The FreeD therefore measures the ring position relative to the head-fixed coordinate frame whose orientation was measured by the IS-300.

Data from both trackers is transmitted to a PC 62 (Pentium 300 MHz, Windows 98) running a program 63 that uses Windows DirectX and Direct3D capabilities to display graphics and effect interaction techniques. The graphics output window of Direct3D is maximized to take control over the entire screen, and VGA output 64 (640×480 at 60 Hz) is passed into the V-Cap HMD as well as a desktop monitor.

The program 63 includes a tracker driver 71 and a fairly conventional VR rendering environment 72 that expects to receive 6-DOF head and hand tracking data from the tracker driver as well as button states 65 for the hand tracking device. The interaction techniques to be described are implemented in the tracker driver. The basic functions of the tracker driver, when tracking a single 3-DOF point on the hand, are:

1. Read in and parse the orientation data 68 from the IS-300 and the position triad 70 from the FreeD.

2. Package the orientation data with the current assumed world-frame head position, and output the combined 6-DOF data record 73 for the head to the VR program. The assumed position is the same as in the previous frame unless the user is performing a navigation interaction such as flying, in which case the position is incremented based on the flying speed and direction.

3. Transform the hand position vector from head frame to world frame by first multiplying by the rotation matrix from head to world frame obtained from the orientation tracker, then adding the current assumed world-frame head position. Output the result to the VR program as a 3-DOF position record 74 for the hand device.
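For concreteness, steps 2 and 3 of the driver can be expressed in a few lines of code. The following Python sketch uses illustrative names and interfaces (R_head_to_world, fly_dir_world, and so on) that do not appear in the patent; it is a simplification of the described logic, not a definitive implementation.

    import numpy as np

    def update(R_head_to_world, p_hand_in_head, head_pos_world,
               flying=False, fly_dir_world=np.zeros(3), fly_speed=0.0,
               dt=1.0 / 60):
        """R_head_to_world: 3x3 rotation matrix from the orientation tracker.
        p_hand_in_head: hand-beacon position in the head-fixed frame (m).
        head_pos_world: assumed world-frame head position from last frame."""
        # Step 2: advance the assumed head position only while flying.
        if flying:
            head_pos_world = head_pos_world + fly_speed * dt * fly_dir_world
        # Step 3: rotate the head-frame hand vector into the world frame,
        # then offset by the assumed head position.
        hand_pos_world = R_head_to_world @ p_hand_in_head + head_pos_world
        # 6-DOF head record and 3-DOF hand record for the VR program.
        return (head_pos_world, R_head_to_world), hand_pos_world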

The simple implementation just described is wearable, but cannot be integrated into an HMD elegantly, largely due to the size and power consumption of the IS-300 processing unit. A low-cost wearable version using available technologies could be implemented as follows:

The core of this implementation is an inertial head orientation module called InterTrax 2 (available from InterSense and designed for use with consumer HMDs such as the Sony Glasstron and Olympus EyeTrek). Using tiny piezoelectric camcorder gyros and solid-state accelerometers and magnetometers, InterTrax 2 is designed as a single long narrow circuit board 30 (FIG. 1) to lie across the top of the head-mounted display unit along the brow line. It is 9 cm long, 2 cm wide, and 0.5 cm thick with all components, except for a vertical gyro in the center, which sticks up 1 cm higher. It contains a low-power embedded 16-bit processor that runs a simplified fixed-point version of the GEOS drift-corrected orientation-tracking algorithm used in the IS-300. It communicates to the host through a single USB connector through which it draws its power, and can be manufactured for very low cost in volume. It is expected to achieve accuracy on the order of 2-3°, which is sufficient because the accuracy with which the hand avatar follows the physical hand is totally independent of orientation tracking accuracy.

Another component is an embedded ultrasonic rangefinder (perhaps based on the Pegasus FreeD technology). As shown in FIG. 1, three microphones 80, 82, 84 and their ultrasonic pulse detection circuits together with the InterTrax 2 board are embedded in a rigid plastic assembly designed to fit elegantly over the brow of an HMD. (In some embodiments, all components would be embedded inside the HMD display unit while sharing the HMD's cable 18, but in others, the added components are clipped on.) The InterTrax 2 processor has enough unused timer inputs and processing bandwidth to timestamp the signals from the three ultrasonic pulse detectors and relay this data down its USB link.

The ultrasonic tracking technology can be modified to take advantage of the very short range requirements. First, ultrasonic frequency may be increased from 40 kHz to a higher frequency. This increases the attenuation in air, and virtually eliminates reverberation and interference between nearby users. Second, the system can take advantage of the much reduced reverberation and the short time-of-flight to increase the update rate of tracking to, say, 240 Hz, thus allowing the system to average 4 position samples for each 60 Hz graphics update, or track up to 4 beacons at 60 Hz. To calculate the resolution that this would yield in various parts of the tracking volume, we calculated the Geometric Dilution of Precision (GDOP) throughout the tracking volume given the intended geometry of the microphone mounts on the headset. The intended headset geometry, tracking range and optical field of view are illustrated superimposed on an isogram of a vertical slice through the GDOP data in FIG. 3. The plane of the microphones is angled downward 45° to ensure that the system has tracking coverage for hands in the lap. The resolution at any point in space is the range measurement resolution (about 0.1 mm for short range ultrasonic measurements using 40 kHz) multiplied by the GDOP value, divided by 2 as a result of the 4× oversampling and averaging. Thus the expected resolution is approximately 0.5 mm at a distance of 400 mm away from the headset.
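This resolution estimate can be reproduced with the standard range-only DOP formula, in which the rows of the geometry matrix are unit line-of-sight vectors from the tracked point to each microphone. In the Python sketch below, the microphone layout is an invented stand-in for the actual headset geometry, so the numbers it produces are illustrative only.

    import numpy as np

    def position_dop(mics, p):
        """Geometric dilution of precision for range-only multilateration:
        rows of H are unit line-of-sight vectors from point p to each mic."""
        H = np.array([(m - p) / np.linalg.norm(m - p) for m in mics])
        return np.sqrt(np.trace(np.linalg.inv(H.T @ H)))

    # Three mics a few centimeters apart on the brow line (meters, head frame);
    # a made-up geometry, not the patent's.
    mics = [np.array([-0.07, 0.0, 0.0]),
            np.array([0.07, 0.0, 0.0]),
            np.array([0.0, -0.05, 0.02])]
    p = np.array([0.0, -0.28, -0.28])    # hand ~400 mm away, 45 degrees down

    range_res = 0.1e-3                   # ~0.1 mm ultrasonic range resolution
    resolution = range_res * position_dop(mics, p) / 2  # /2 from 4x averaging
    print(f"expected resolution ~ {resolution * 1e3:.2f} mm")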

A goal of a wearable computer is to keep the user's hands free to perform tasks. For this reason, the system uses a wireless 3-DOF ring pointer for interaction. The FreeD ring-mouse previously described is approximately the right size. In some implementations of the system, the tracker will need to be triggered by a unique IR code from the headset, so that multiple beacons can be tracked.

In interactive visualization and design (IVD) and many other VR applications, a pen-style input device may be more useful. An implementation could use a wireless 5-DOF pen using the same basic technology as the 3-DOF ring pointer, but employing two emitters that are activated in an alternating sequence. A compact omni-directional pen could be implemented using cylindrical radiating ultrasonic transducers that have been developed by Virtual Ink (Boston, Mass.), mounted at the ends of a cylindrical electronics unit approximately the size of a normal pen, with two mouse buttons.

An additional device that could be included in the system and whose applications are discussed below is a small wireless anchor beacon that can be easily stuck to any surface. Ultrasonic beacons from InterSense are of suitable size and functionality.

Portable VR Application

Object Selection and Manipulation Exploiting Proprioception

M. Mine, F. Brooks, and C. Sequin (Moving Objects in Space: Exploiting Proprioception in Virtual Environment Interaction. In SIGGRAPH 97 Conference Proceedings, ACM Annual Conference Series, August, 1997) have discussed the benefits of designing virtual environment interaction techniques that exploit our proprioceptive sense of the relative pose of our head, hands and body. A variety of techniques were presented, such as direct manipulation of objects within arm's reach, scaled-world grab, hiding tools and menus on the user's body, and body-relative gestures.

Implementations of the invention have advantages over conventional world-frame tracking systems for implementing these techniques effectively. With conventional trackers, any error in head orientation tracking will cause significant mismatch between the visual representation of the virtual hand and the felt position of the real hand, making it difficult to accurately activate hidden menus while the virtual hand is not in view. With implementations of the invention, the head orientation accuracy is immaterial and visual-proprioceptive match will be good to the accuracy of the ultrasonic tracker--typically 1-2 mm.

Locomotion & View Control Tricks

This section describes a few techniques to permit user locomotion and view control.

Flying and Scaled-world Grab

The usual navigation interface device in fly-through virtual environments is a joystick. This is appropriate for a flight simulator, but reduces one's sense of presence in terrestrial environments, where turning one's body toward the destination is more instinctive than turning the world until the destination is in front. Implementations of the invention support this more immersive type of flying. No matter how one turns, if she raises a hand in front of her it will be trackable, and can be used to control flight speed and direction. Better yet, she can use two-handed flying, which can be performed with the arms in a relaxed position and allows backwards motion, or the scaled-world grab method to reach out to a distant object and pull oneself to it in one motion.
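The patent does not spell out a control law for two-handed flying, but one common formulation, sketched below under that assumption, flies the user along the vector between the two tracked hands with speed proportional to their separation beyond a dead zone; swapping which hand leads yields backwards motion.

    import numpy as np

    def two_handed_fly_velocity(p_left_world, p_right_world,
                                gain=2.0, dead_zone=0.10):
        """Illustrative mapping: hover when the hands are together, otherwise
        fly along the left-to-right hand vector, faster as they separate."""
        v = p_right_world - p_left_world
        d = np.linalg.norm(v)
        if d < dead_zone:
            return np.zeros(3)
        return gain * (d - dead_zone) * (v / d)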

Walking Using Head Accelerometers as a Pedometer

For exploratory walk-throughs, the sense of presence is greatest for walking, somewhat reduced for walking-in-place, and much further reduced for flying. M. Slater, A. Steed and M. Usoh (The Virtual Treadmill: A Naturalistic Metaphor for Navigation in Immersive Virtual Environments. In First Eurographics Workshop on Virtual Reality, M. Goebel Ed. 1993), and M. Slater, M. Usoh and A. Steed (Steps and Ladders in Virtual Reality. In Proc. Virtual Reality Software & Technology 94, G. Singh, S. K. Feiner, and D. Thalmann, Eds. Singapore: World Scientific, pages 45-54, August 1994) have described a "virtual treadmill" technique in which a neural network is trained to recognize the bouncing pattern of a position tracker on an HMD, and thus control virtual motion. Inertial head-orientation trackers do not normally output the position obtained by double integrating the accelerometers, because it drifts too much to be useful, but it seems reasonable that pattern analysis of the acceleration signals would produce good results.
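As a rough illustration of such pattern analysis, the sketch below counts bounce peaks in the vertical accelerometer channel. The thresholds are guesses for illustration, not values from the patent, and a trained classifier such as the cited neural network would replace this heuristic.

    def detect_steps(acc_z, rate_hz, min_peak=1.5, min_gap_s=0.3):
        """Count bounce peaks in vertical acceleration (m/s^2, gravity
        removed); each detected peak is treated as one step in place."""
        min_gap = int(min_gap_s * rate_hz)
        steps, last = [], -min_gap
        for i in range(1, len(acc_z) - 1):
            is_peak = (acc_z[i] > min_peak and
                       acc_z[i] >= acc_z[i - 1] and acc_z[i] >= acc_z[i + 1])
            if is_peak and i - last >= min_gap:
                steps.append(i)
                last = i
        return steps

    # Each detected step would advance the assumed world-frame head position
    # by one stride length along the current heading.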

Head-Motion Parallax Using Anchor Beacon

When working with close objects, head motion parallax is an important visual cue. It can be achieved with the tracking system of the invention on demand by using a trick. Normally, the system uses the 3-DOF position vector from the user's head to the hand-mounted beacon to track the position of the hand relative to the head, maintaining the head location fixed. When desired, the user may hold the hand still (say on a desk), and push a button to reverse this process, so that the tracker driver interprets the negative of the measured vector (in world frame) as a position update of the head relative to the stationary hand. He can then move his head back and forth to look around an object, and release the button when his viewpoint is repositioned for optimal viewing. After flying or walking to an area, this may be a convenient way of making finely controlled viewpoint adjustments using natural neck motion. Note that this operation is equivalent to grabbing the world and moving it around with one's hand, which may be a more convenient maneuver while standing.

Implementations of the invention can perform full 6-DOF head tracking using only one fixed reference point in the environment, while most acoustic and optical trackers require at least three. This works in the invention because head orientation is completely constrained by the sourceless head-tracker. This observation suggests another interesting trick. One may carry an extra wireless anchor beacon in a pocket and place it down on the table or stick it to a wall near a work area. Within range of this beacon, he can enjoy full 6-DOF tracking of both head and hand.
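Both tricks come down to the same equation: because the sourceless tracker fully constrains head orientation, a single head-relative fix on one known world point determines the head position. A minimal sketch, reusing the conventions of the earlier driver sketch, follows; whether the fixed point is the user's own stationary hand (parallax mode) or an anchor beacon on the table makes no difference to the math.

    import numpy as np

    def head_position_from_fixed_point(R_head_to_world, p_beacon_in_head,
                                       p_beacon_world):
        """With orientation known, one head-relative measurement of a known
        world point gives full 6-DOF head pose: p_head = p_fixed - R * v."""
        return p_beacon_world - R_head_to_world @ p_beacon_in_head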

Wearable Computing Information Cockpit Interface

Information Cockpit Metaphor

In the field of wearable computing, three modes of displaying objects in a head-mounted display have been discussed. Head-stabilized objects are displayed at a fixed location on the HMD screen, so they move with your head motion and require no tracking. World-stabilized objects are fixed to locations in the physical environment. To cause them to stay fixed despite user head-motion requires full 6-DOF head tracking. Body-stabilized objects are displayed at a fixed location on the information surround, a kind of cylindrical or spherical bubble of information that follows the user's body position around. Head orientation tracking allows the user to look at different parts of the surroundings by turning his head, but position tracking is not needed.
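The three modes differ only in how much of the head pose is undone before an object is projected to the screen. The following sketch, with illustrative names, makes that explicit; R_head_to_frame is the rotation reported by the head orientation tracker.

    import numpy as np

    def object_in_head_frame(mode, p_object, R_head_to_frame,
                             head_pos=np.zeros(3)):
        """Return the object's position in head coordinates, ready to
        project. 'head': fixed on the screen; 'surround': body- or
        cockpit-stabilized, orientation only; 'world': needs 6-DOF head
        tracking."""
        if mode == "head":
            return p_object
        if mode == "surround":
            return R_head_to_frame.T @ p_object
        if mode == "world":
            return R_head_to_frame.T @ (p_object - head_pos)
        raise ValueError(mode)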

Pure head-stabilized displays are usually used with small opaque monocular monitors mounted off to the side of the user's field of view. Without head tracking, this is better than having a display directly in front of the eye with information constantly blocking the frontal view. Use of this paradigm is widespread, and most of the wearable computer vendors provide this style of untracked sidecar display. This is roughly equivalent to wearing your desktop computer on your belt with the monitor mounted on a headband so that it is always available for hands-free viewing.

At the other end of the spectrum are world-stabilized AR displays, which must be implemented using see-through optics placed directly in front of the eyes. For a variety of applications such as surgery, construction and maintenance, this is a highly valuable capability. However, it requires sophisticated tracking and calibration, and is likely to remain a high-end subset of the total wearable computing market for quite a few years.

In the middle ground of complexity are the less common body-stabilized displays, which also tend to be implemented with see-through HMDs. As implemented by S. Feiner, B. MacIntyre, M. Haupt, and E. Solomon (Windows on the World: 2D Windows for 3D Augmented Reality. In Proc. ACM UIST 93. ACM Press, November 1993), objects were drawn on a 170° horizontal by 90° vertical portion of a sphere. To prevent user disorientation, this hemispherical "virtual desk" was kept in front of the user's body by mounting an additional orientation tracker on the user's torso, and using the difference between the head yaw and torso yaw to pan the viewport. The desk was thus slaved to the user's torso, and the user could easily locate windows on it using his innate knowledge of head turn relative to the torso. This is intuitive but has the drawback that an additional orientation sensor must be mounted on the user's torso. This adds cost, makes the system more difficult to don, and causes the virtual desk to shift around in response to slight postural shifting of the user's torso, wobbling of the sensor mount, or metallic distortion of the relative magnetic heading between the two sensors. An implementation of the invention uses a variation on this theme, based on an "information cockpit" metaphor instead of a body-stabilized desk.

The information cockpit consists of a clear windshield, optionally drawn as a thin wireframe border, and a cluster of virtual instruments around it. As with the body-stabilized technique, the user's head is always in the center of the cockpit, but the heading direction of the cockpit stays fixed until the user changes it. Generally, the user first positions the windshield towards the objects he will be working on with his hands, and keeps the windshield area fairly clear of augmentations so that he can see what he is doing. Thereafter, the user can turn to look at the instruments, with or without turning his torso, and the instruments will not move. To prevent the user from becoming disoriented or being forced to strain his neck as he moves around, the implementation provides the user with steering techniques.

Outdoor Navigation Application

FIG. 4 shows an example of an information cockpit for an outdoor navigation application. The active field-of-view of the see-through HMD is indicated by heavy black rectangle 400. Thus only the augmentations within this rectangle are visible to the user, but rotating the head moves this active view port around the scene and reveals the other augmentations once they are inside of it. In this example there are a few frequently-used icons 401 that are fixed (i.e. head-stabilized) in the upper right of the heads-up display that will always be visible. There are additional icons 402 in the dashboard that are stabilized to the information cockpit, and therefore can only be seen when the user looks down a little to check them. Some of these are miniature information instruments, such as dials and gauges, while others are icons used to bring up larger information instruments such as a web browser or interactive map display. By clicking on the map icon on the dashboard, the full-size map application window 404 pops up in the middle of the active display area. The user may either quickly examine it then minimize it again, or save it for on-going reference by fixing it to a convenient spot on the information cockpit "windshield" 410 as has been done in FIG. 4. The user can see a corner of the map in the current view, but can look at the whole map again by looking up and to the right. Virtual rear view mirrors 406 (fed by a video camera on the back of the head) have likewise been placed in three locations on the virtual cockpit, but the user can re-position or close any of these four information instruments at any time. In this example, the heading direction of the cockpit is controlled by the application in order to guide the user to a destination. Using a GPS receiver in the user's wearable computer, the application orients the cockpit along the direction from the user's current position to the destination, so he need only follow the dotted lines 408 to their vanishing point on the horizon to walk in the correct direction. This provides a virtual sidewalk in the forest, much as pilots are guided by virtual tunnel-in-the-sky displays. In an urban setting, the computer would use map correlation to orient the cockpit along the current road in the suggested walking direction.
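The cockpit-steering step itself is ordinary great-circle navigation. In the hedged sketch below, only the bearing formula is standard; the interface between the GPS receiver and the cockpit, and the coordinates, are illustrative assumptions.

    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2),
        in degrees clockwise from north."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = (math.cos(phi1) * math.sin(phi2) -
             math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
        return math.degrees(math.atan2(y, x)) % 360.0

    # Each GPS fix re-orients the cockpit so the windshield (and the dotted
    # virtual sidewalk) points along the bearing to the destination.
    cockpit_heading = bearing_deg(42.3601, -71.0589, 42.4430, -71.2290)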

Steering and Interaction

The ring tracker can be used for several purposes in wearable computer applications: direct pointing to objects, virtual mouse pad cursor control, command gestures, and measuring or digitizing.

Direct Pointing to Objects

When the ring tracker enters the viewing frustum of the HMD, the cursor jumps to the location of the ring and follows it. This provides rapid direct selection of objects, taking full advantage of natural eye-hand coordination. In the virtual cockpit, one may glance up from the windshield to a side panel, see an instrument he wants to use, reach out to exactly where he sees it and click on it with one of the ring buttons to activate it or drag it into another view.

Many useful operations can be accomplished most easily with direct selection and manipulation of objects. You can move and resize windows (i.e. instruments) the usual 2D way by dragging their borders. However, you can also exploit the 3D tracking of the ring to simultaneously move and resize an instrument. Simply grab the title bar and pull it toward you to make it larger or away from you to make it smaller, while simultaneously positioning it. If you pull it in towards your head far enough, as if to attach it to your HMD, it will change colors, indicating that if you let go of it, it will remain as a head-stabilized object. This is effectively like grabbing an instrument off your cockpit panel and attaching it to your Heads-Up-Display (HUD) so that it will always be visible in the foreground no matter where you look. By pushing it away far enough it will convert back to a cockpit panel instrument.

One of the cockpit windows that can be manipulated in a similar manner is the windshield itself. Simply click on any clear area of the "glass" where there aren't any graphical objects you might accidentally select, then drag it left/right or up/down to rotate the whole cockpit in space. This is one way of "steering" the cockpit, which is particularly useful for small course corrections or size adjustments or to refocus your attention on another area of the workbench nearby.

Virtual Mouse Pad Cursor Control

Though fast and intuitive, the direct pointing technique would become very tiring if used to work with an instrument that requires extended repetitive clicking, such as a web browser or hypertext manual. A virtual mouse pad technique can overcome this problem. As soon as the user's hand drops below the viewing frustum of the HMD, the cursor control automatically switches into this mode, in which left-and-right motion of the ring moves the cursor left-and-right, in-and-out motion moves it up and down, and vertical position has no effect. This allows the user to rest his hand comfortably in his lap or on a desk, and control the cursor by sliding his hand horizontally a few inches as if on an imaginary mouse pad.

It is desirable that if the user positions the cursor on a particular object then moves his head without moving the ring, the cursor will remain on the object. This means that the cursor is drawn as an object in the cockpit-stabilized coordinates rather than the head-stabilized screen coordinates. This has several implications. First, the cursor is associated with a point on the spherical information cockpit surface, only a portion of which is visible in the HMD, so the cursor could be out of view and quite difficult to find. A wiggling gesture is then used to bring it back into the current center of display. Second, the ring tracking must be calculated in the cockpit stabilized coordinate frame, which means that if the user turns to the right, an "in-and-out" motion switches from cockpit x-axis to y-axis and has an unexpected effect. To avoid this, the ring position is transformed into cylindrical polar coordinates and the radial and tangential components are used to control cursor vertical and horizontal motion respectively.
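A minimal sketch of this mapping follows, with an illustrative gain that is not from the patent; the wrap-around handling keeps the tangential term well behaved when the user turns through the +/-180 degree seam.

    import math

    def mousepad_delta(p_prev, p_curr, gain=800.0):
        """p_* are ring positions in cockpit-stabilized coordinates (x, y
        horizontal, z up). Cylindrical coordinates make the mapping
        independent of which way the user's body happens to be facing."""
        r0, th0 = math.hypot(p_prev[0], p_prev[1]), math.atan2(p_prev[1], p_prev[0])
        r1, th1 = math.hypot(p_curr[0], p_curr[1]), math.atan2(p_curr[1], p_curr[0])
        dth = math.atan2(math.sin(th1 - th0), math.cos(th1 - th0))  # wrap
        dx = gain * r0 * dth     # tangential motion -> horizontal cursor
        dy = gain * (r1 - r0)    # radial in-and-out motion -> vertical cursor
        return dx, dy            # ring height (z) is deliberately ignored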

Command Gestures

Ring tracker gestures may be used as a substitute for voice commands in situations where visual theatrics are more acceptable than audible ones, or where it is too noisy for reliable speech recognition. In general, gestures should commence outside of the direct pointing and virtual mouse pad regions, in order to avoid accidentally selecting and moving objects. This leaves the sides and top of the viewing frustum, and the first few inches in front of the face (which are not used for direct pointing). The gestures are executed by depressing a mouse button, possibly making a certain movement, then releasing the button. They are always relative to the head in order to exploit proprioception, and the fact that the head is tracked, while the rest of the body is not. Many gestures may be defined, but the most commonly needed is a boresight command to reset the heading direction of the cockpit to the current forward direction of the person's head as he walks about.

Measuring or Digitizing

Most people can hold their head very still, which opens the possibility that the ring tracker can be used to make measurements between two points that are close enough that both can be seen without moving the head. This might be useful in an application such as taking inventory of how many pieces of each size are in a stockroom. Likewise, an application might ask you to quickly digitize a few corners of a component so it can determine based on the dimensions what model of the component you are looking at and locate the appropriate manual pages.

To measure the distance between two close objects that are both within the display FOV at the same time, the user clicks both objects while holding his head still. The distance is computed as the norm of the difference of the two vector positions thus stored.

For two objects that are too far apart to be in the display FOV at once, a more elaborate procedure may be employed. The user first looks at the first object, positions the pointer beacon on it and depresses a button. At the moment the button is pressed, a world frame position vector (p1) of the first object is stored and then the tracking mode is switched to 6-DOF tracking of the head relative to the stationary hand-held pointer, as previously described. While holding the pointer stationary on the object and keeping the button depressed, the user then repositions his head until the second object is in view, releases the button, and holds his head still while moving the pointer to the second object, then clicking it to capture the second position vector (p2) in the same world coordinate frame as the first. This technique may be practiced either with a single pointing beacon operated by one hand, or using separate pointing beacons in each hand, to achieve approximately the same functionality as a conventional tape measure, but with the added benefit that the measurements are automatically stored on a digital computer.
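Sketched in the conventions of the earlier driver code, the whole procedure is a small state machine: the first click stores p1, the held-button phase re-solves the head position from the stationary pointer, and the second click stores p2 in the same world frame. The event interface below is invented for illustration.

    import numpy as np

    def measure_two_points(events):
        """events: sequence of (kind, R_head_to_world, p_hand_in_head,
        head_pos_world); kind is 'click1' (press on the first object),
        'release' (head repositioned, pointer held still), or 'click2'.
        Returns the distance between the two clicked points."""
        p1 = p2 = head_pos = None
        for kind, R, p_hand, head in events:
            if kind == 'click1':
                head_pos = head
                p1 = R @ p_hand + head_pos     # first point, world frame
            elif kind == 'release':
                head_pos = p1 - R @ p_hand     # hand still: re-solve head
            elif kind == 'click2':
                p2 = R @ p_hand + head_pos     # second point, same frame
        return float(np.linalg.norm(p2 - p1))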

Relationships among remote objects may also be measured using standard triangulation surveying methods, exploiting the functional similarity of a see-through HMD optic with orientation tracker to a surveyor's theodolite (although a tripod-mounted theodolite is likely to be more accurate).

Mixed Display and AR-on-Demand Applications

The previous section presented the information cockpit as a specific variation on Feiner's body-stabilized information surround. However, the cockpit metaphor also allows the user to make use of the head-stabilized and world-stabilized coordinate frames at the same time. The previous section gave one example of this in which the pilot drags information from the cockpit onto the HUD, which makes it head-stabilized. For example, one may wish to have an alerting device always visible in the HUD that pops up notifications whenever a phone call, page or email is received, or when a scheduled meeting is about to begin, etc.

Likewise, one may wish to grab a certain instrument and paste it onto a physical object in worldspace. For example, while debugging a circuit board, you could overlay an interactive block diagram or schematic on the board, and attach a virtual scope trace to your hand that is holding the scope probe (which is possible because the hand is tracked by the ring pointer). To do this, you must first plant an anchor beacon, then click three corners of the circuit board to align the block diagram to it.

One important reason to plant anchor beacons is to create a shared AR workspace for communication or collaboration with coworkers as described in M. Billinghurst, S. Weghorst and T. Furness (Shared Space: An Augmented Reality Approach for Computer Supported Cooperative Work. Virtual Reality Vol. 3(1) 1998) and D. Schmalstieg, A. Fuhrmann, Z. Szalavari, and M. Gervautz (Studierstube: An Environment for Collaboration in Augmented Reality. In CVE 96 Workshop Proceedings, September, 1996) incorporated by reference. Imagine a paperless construction site with numerous workers building a structure according to the plans they are viewing on their wearable computers. It is nice that they don't have to drag large rolls of blueprints around, but they have no way to stand around a blueprint and point to things. The solution is for someone to drop two anchor pins on a table, defining the top two corners of a virtual blueprint or model that each person can see in correct perspective from his own vantage point.

A Variant Technique for Tracking the User's Hand

Some implementations of the invention use an inertial orientation sensor to track the rotation of the head, and an acoustic or optical position tracker to track the position of the hand relative to the head. For many applications, the performance of the acoustic or optical position tracker is sufficient. Furthermore, it has the great advantage that the item being tracked can be a small wireless transponder, or even a passive marker. For some applications, such as the ring-mounted pointing device for wearable computing, this is an overwhelming advantage.

However, for some applications, such as a virtual reality game, it may be desired to have the virtual object controlled by the hand tracker (e.g. a virtual sword or gun or racquet) respond to the hand motion with extremely fast smooth response. Acoustic, magnetic, or videometric hand trackers may introduce noticeable latency or jitter in these applications. Inertial position and orientation trackers are well known to provide extremely low latency and low jitter, but they require drift correction, especially if tracking position and not just orientation is desired. In a typical virtual reality application, the user's head and hand may both be tracked with 6 degrees of freedom relative to an external reference frame by using inertial sensors on the head and on the hand to measure their motion with a high update rate and low latency. The drift of these inertial sensors is corrected by making measurements with an ultrasonic, optical or magnetic tracking reference device mounted in the environment.

In some implementations of the present invention, the drift and latency issues can be addressed without requiring a reference device mounted in the environment. Foxlin, "Head-tracking Relative to a Moving Vehicle or Simulator Platform Using Differential Inertial Sensors," Proceedings of Helmet and Head-Mounted Displays V, SPIE Vol. 4021 (2000), and co-pending U.S. patent application Ser. No. 09/556,135, both incorporated herein by reference, describe techniques that enable inertial sensors to track the motion of an object relative to a reference frame that is itself moving, even when that motion is not completely known. These techniques require inertial sensors on the moving body serving as the reference frame as well as on the tracked object. In the cited references, the reference frame is a vehicle or motion platform and the tracked object is a head; in the present invention, the moving reference frame may be the user's head and the tracked object may be the user's hand or a hand-mounted or hand-held object. The techniques use the angular rate and linear acceleration signals from the sourceless orientation trackers on the reference frame and on the tracked object to derive a differential inertial signal representative of the motion of the object relative to the frame. In embodiments of the present invention, this technique may be used to derive a differential inertial signal representative of the motion of the hand relative to the head.
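A highly simplified sketch of such a differential inertial signal follows (assumed names; the cited references treat the full problem, including lever-arm and Coriolis effects from head rotation, which are ignored here). With each IMU's orientation filter providing a body-to-world rotation, the gravity component of the two accelerometer readings cancels in the difference, so the relative acceleration can be integrated without any environment-mounted reference.

```python
import numpy as np

def relative_accel(R_head, f_head, R_hand, f_hand):
    """Differential inertial signal: acceleration of the hand relative to
    the head, expressed in world axes. R_* are 3x3 body-to-world rotations
    from each IMU's orientation filter; f_* are accelerometer specific
    forces in body axes. Gravity enters both terms identically and cancels.
    (Sketch only: lever-arm and Coriolis terms are omitted.)"""
    return R_hand @ f_hand - R_head @ f_head

def integrate(pos, vel, a_rel, dt):
    """Propagate the relative position and velocity; the drift accumulated
    here is what the range measurements must correct."""
    vel = vel + a_rel * dt
    pos = pos + vel * dt
    return pos, vel
```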

FIG. 5 illustrates a user wearing a portable VR tennis game or training system. The computer and batteries are contained in backpack 502, which is cabled to HMD 500, to which are mounted inertial sensors 506 and ultrasonic transducers 510. The user holds a hand-held object 516, in this case a tennis racquet, to which are attached inertial sensors 508 and ultrasonic transducers 512. These hand-mounted devices may be powered by their own batteries and communicate wirelessly with the system on the user's head and torso, or an additional cable may run between the racquet and the backpack. The signals from inertial sensors 506 are processed by a first algorithm, preferably a drift-corrected inertial orientation tracking algorithm such as described in U.S. Pat. No. 5,645,077, to obtain a sourceless measurement of the head orientation. In addition, the signals from the hand-mounted inertial sensors 508 and the head-mounted inertial sensors 506 are jointly processed to track both the position and orientation of the hand relative to the head, preferably using an algorithm such as described in Foxlin (2000) and co-pending U.S. patent application Ser. No. 09/556,135. The drift of this relative inertial tracking is corrected by the relative range measurements 514. The illustrated system also includes earphones 504 to provide 3D spatialized audio, and a haptic feedback device 518 to provide tactile feedback to the user when the virtual ball hits the virtual racquet.
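For illustration, a single relative range measurement 514 could correct the drifting relative position estimate as in the following sketch (all names are assumptions, and a fixed-gain nudge stands in for the Kalman filter an actual implementation would likely use).

```python
import numpy as np

def range_update(pos, emitter_offset, mic_pos, measured_range, gain=0.2):
    """Correct the inertially tracked hand position (head frame) with one
    ultrasonic range between a racquet-mounted transducer and a
    head-mounted microphone. The estimate is nudged along the line of
    sight by a fraction of the range residual."""
    emitter = pos + emitter_offset        # predicted transducer position
    line = emitter - mic_pos
    predicted = np.linalg.norm(line)
    residual = measured_range - predicted
    return pos + gain * residual * (line / predicted)
```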

In general, such a system may be used for other types of activities, such as a sword-fighting or gun-fighting game or trainer, a surgical trainer, an immersive design environment, a human-computer interface, or any other application, known or not yet known, that requires tracking of a user's head and one or more limbs or limb-mounted devices. While the system is especially advantageous for mobile or portable applications in which the computer is wearable, this is not a requirement: the user may be cabled to an off-body computer or communicate with one through a wireless connection. Even in that case, it remains an advantage of the current invention that the tracking is accomplished without setting up an off-body reference device.

Other embodiments are within the scope of the claims.

The implementations described above track the hand with a head-mounted acoustic tracking system because this technology can be embedded entirely in a lightweight headset and achieves high-resolution tracking over a very wide FOV.

However, the head-mounted position tracker need not be acoustic. It may be an electro-optical system that tracks LEDs, optical sensors, or reflective markers; a video machine-vision device that recognizes the hands or fingers or special markers mounted on the hands, fingers, or a handheld object; a magnetic tracker with a magnetic source held in the hand and sensors integrated in the headset, or vice versa; or an RF position-locating device.

The implementations described above use inertial sourceless orientation trackers. Other implementations may use other forms of head orientation trackers, including trackers based on tilt-sensing or magnetic compass sensors, or any other form of head orientation tracker. In fact, some implementations may use no head orientation tracker at all. In this case, the tracking system would not enable the user to look around in a virtual environment by turning his head, but it would still be useful for manual interaction with computers using head-worn displays.

A number of embodiments of the invention have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the invention.

Foxlin, Eric

Patent Priority Assignee Title
5645077, Jun 16 1994, Massachusetts Institute of Technology, Inertial orientation tracker apparatus having automatic drift compensation for tracking human head and other similarly sized body
5812257, Nov 29 1990, VPL NEWCO, INC., Absolute position tracker
5850201, Nov 30 1990, Sun Microsystems, Inc., Low cost virtual reality system
5856844, Sep 19 1996, Omniplanar, Inc., Method and apparatus for determining position and orientation
6124838, Nov 30 1990, Sun Microsystems, Inc., Hood-shaped support frame for a low cost virtual reality system
6172657, Feb 26 1996, Seiko Epson Corporation, Body mount-type information display apparatus and display method using the same