A head mounted display system for use with a mobile computing device comprises a soft main body made entirely of a soft and compressible material, the main body having a retention pocket entirely formed by the material and configured to accept and secure the mobile computing device, and a lens assembly comprising two lenses configured to focus vision on respective areas of a display screen of the mobile computing device, the lens assembly held within one or more apertures formed in the main body entirely by the material, the two lenses mounted for independent movement with respect to each other, such that a split screen image may be viewed through the two lenses on the display screen.
1. A head mounted display system for use with a mobile computing device, comprising:
a single-piece soft main body made entirely of a homogeneous unitary soft and compressible material without an internal or external structural rigid skeleton, the soft main body configured to be worn on a human head completely covering eyes of the human head and providing a seal around the exterior of the eyes blocking out substantially all exterior illumination, wherein the main body further has a retention pocket disposed in front of the eyes entirely formed by the material and configured to accept and secure the mobile computing device therein; and
a lens assembly comprising two lenses configured to focus vision on respective areas of a display screen of the mobile computing device, the lens assembly held within one or more apertures formed in the main body entirely by the material, the two lenses mounted for independent movement with respect to each other, such that a split screen image may be viewed through the two lenses on the display screen.
13. A head mounted display system for use with a mobile computing device, comprising:
a single-piece main body made entirely of a homogeneous unitary soft and compressible material formed as goggles with a front face, two side walls, a top wall, and a bottom wall each extending rearwardly from the front face, a rear-facing contact lip including a nose bridge making up a part of the bottom wall, and an internal cavity defined within the walls, the main body being made of a material that may be molded into the goggle shape and, when solidified, is soft and compressible, the goggles configured to be worn on a human head,
wherein the internal cavity defines a retention pocket entirely formed by the material and configured to accept and secure a mobile computing device therein, the retention pocket being positioned just rearward from the front face of the main body within the internal cavity and having at least two projections formed by the material spaced evenly across a central vertical plane through the goggles that contact top and bottom ends of the mobile computing device and laterally center the mobile computing device when inserted into the retention pocket; and
two separate lens assemblies movably mounted within the internal cavity, each comprising a lens configured to focus vision on a respective area of a display screen of the mobile computing device, such that a split screen image may be viewed through the two lenses on the display screen.
25. A head mounted display system for use with a mobile computing device, comprising:
a main body made entirely of a homogeneous unitary soft and compressible material formed as goggles with a front face, two side walls, a top wall, and a bottom wall extending rearwardly from the front face, and an internal cavity defined within the walls, the goggles configured to be worn on a human head so as to conform to a wearer's face, with a contoured lip that conforms to the temples of the wearer and prevents light from entering the internal cavity of the goggles from the rear;
a retention pocket defined by the main body and made entirely of the material, configured to accept and secure a mobile computing device therein, the retention pocket being positioned just rearward from the front face of the main body; and
two separate lens assemblies movably mounted within the internal cavity just rearward from the retention pocket, each lens assembly comprising a lens configured to focus vision of the wearer on respective portions of the display so as to generate a stereoscopic three-dimensional image to the wearer, each lens assembly having portions that extend outward of both the top and bottom walls of the main body enabling the wearer to manually displace the respective lens assemblies.
A portion of the disclosure of this patent document contains material which is subject to copyright protection. This patent document may show and/or describe matter which is or may become trade dress of the owner. The copyright and trade dress owner has no objection to the facsimile reproduction by anyone of the patent disclosure as it appears in the Patent and Trademark Office patent files or records, but otherwise reserves all copyright and trade dress rights whatsoever.
The present application claims priority under 35 U.S.C. §119 to U.S. Provisional Application Ser. Nos. 62/060,996, filed Oct. 7, 2014, and 61/941,294, filed Feb. 18, 2014, both entitled “Mobile Virtual and Augmented Reality System and Use,” the contents of which are expressly incorporated herein by reference.
This disclosure relates generally to wearable computers, and more specifically to goggles which receive a mobile computing device such as a smartphone to provide a mobile virtual and augmented reality system, whereby a user can experience and control virtual reality (VR), augmented reality (AR), and stereoscopic experiences, such as three dimensional (3D) and 360° movies and computer games.
Any discussion of the prior art throughout this specification should in no way be considered as an admission that such prior art is publicly known or forms part of common general knowledge in the field.
In the 1960s, Ivan Sutherland presented a virtual 3D world to users using an early vector cathode ray tube (CRT) head mounted display. Tracking was performed by a set of either mechanical or ultrasonic sensors. A general purpose computer processed the tracking data, while a special purpose graphics processor made the appropriate perspective transforms on scene data. Sutherland wrote, “No available general-purpose computer would be fast enough to become intimately involved in the perspective computations required for dynamic perspective display.”
Since that time, the graphics hardware industry has grown and matured. With the rise of the video game industry, there is now a commoditized marketplace for high performance graphics chipsets. Such chipsets enable almost any general-purpose computer to run 3D game engines and allow these machines to “intimately” participate in real-time perspective display. These chipsets are now in mobile computing devices, such as current smartphones, bringing 3D game engines to these smaller devices.
Head mounted displays (HMDs) have provided gateways into various augmented and virtual realities, and have been used in many industries in addition to gaming as a means of allowing hands free and immersive viewing of computer generated and filmed (e.g., 360° cameras) content. However, these displays were typically manufactured in low volumes, were built for a customer base of researchers and niche application developers, and cost thousands, if not tens of thousands, of dollars. There have been some steps towards commodity virtual reality displays for gaming, such as the Nintendo Virtual Boy™, but these products have been commercially unsuccessful. A variety of relatively low cost mobile HMDs (MHMDs) have been available at the $1,000-and-lower price point, beginning with models such as the Sony Glasstron™ and Virtual I/O iGlasses™, and continuing with some models today.
There is a need for a more ergonomic and user-friendly system for MHMDs that leverages the sophistication and capabilities of current mobile computing devices.
Various embodiments of the invention are described below with reference to the accompanying diagrammatic drawings.
The present application provides an ergonomic and user-friendly head mounted display for producing virtual reality (VR), augmented reality (AR), and stereoscopic experiences, such as three dimensional (3D) and 360° movies and games. The head mounted display includes soft goggles that conform to a wearer's face and include a slot for receiving and retaining a mobile computing device, such as a smartphone. A pair of lenses adjustably mounted within the goggles provide a stereoscopic image of the display of the smartphone within the goggles. One or two remote controls may be mounted to the goggles for additional functionality.
The term “head mounted display” or HMD refers to any apparatus that can be mounted on the head to provide the wearer a personal viewing experience. Illustrated embodiments include goggles that are strapped around the back of the head and have a main body which receives a mobile computing device therein. Although an HMD can be relatively cumbersome, the HMDs described herein are relatively lightweight and portable, and thus are referred to as mobile head mounted displays, or MHMDs.
The term “mobile computing device” refers to a portable unit with an internal processor/memory and a display screen, such as a smartphone. Mobile computing devices can be smartphones, cellular telephones, tablet computers, netbooks, notebooks, personal data assistants (PDAs), multimedia Internet enabled cellular telephones, and similar personal electronic devices that include a programmable processor/memory and display screen. Such mobile computing devices are typically configured to communicate with a mobile bandwidth provider or wireless communication network and have a web browser. Many mobile computing devices also include a rear-facing camera which provides additional functionality when coupled with the MHMDs of the present application.
In the exemplary head mounted display HMD shown in
As noted, a strap 40 may be used to securely attach the main body to the user's head, as illustrated in
The exemplary mobile computing device 50 as seen in
As also indicated, in the exemplary embodiment shown, the main body 10 has a Velcro™ element 11 to allow the re-attachment of the remote controller 30 as shown in
Additional or alternative mechanisms to the frames 50, 51 are envisioned that allow for similar functionality, such as, for example, an internal frame that operates like a toaster mechanism, allowing the mobile computing device to be inserted into the main body and click into place, with another push releasing the device. Furthermore, one or more internal frames may be provided, such as one to define a pocket to retain the mobile computing device and another to define channels within which are mounted the lens assembly 20.
The lens assembly is located between the user 70 and mobile computing device screen 52, as illustrated in
The image on mobile computing device screen L is the left portion of the stereoscopic image, while mobile computing device screen R is the right portion of the stereoscopic image. Video content which is stereoscopic may be downloaded to the mobile computing device 50 to allow a person to perceive the images through the lenses 21a, 21b as one single three-dimensional image. Alternatively, stereoscopic display software or apps may be downloaded to the mobile computing device 50 and used to convert any single image into one which is stereoscopic. Stereoscopic viewing allows creation of virtual reality (VR), augmented reality (AR), 360 video, as well as 3D video.
In the case of a wireless connection, the application running on the mobile computing device 50 may use a method of detecting one or more controllers and determining whether the application can or should connect to them based on the distance from the mobile device 50, as estimated from the signal strength of the remote controller. Alternatively, physical interaction between the mobile computing device (or HMD) and a controller (e.g., pressing or holding down a button) may signal that they should attempt to communicate with one another. In addition, the application running on the device may connect to multiple controllers and provide distinct functionality to each connected controller. The application running on the mobile device 50 may also provide a means of storing a record of connected controllers, e.g., in the memory of the mobile device 50, so that the system can ignore other controllers if needed.
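As a concrete illustration of this discovery-and-connect logic, the following is a minimal sketch, assuming a Bluetooth LE transport (one plausible wireless connection) via the `bleak` library; the advertised name prefix, RSSI threshold, and record-keeping structure are illustrative assumptions, not details from this disclosure.

```python
# Sketch only: pick controllers by signal strength (a distance proxy),
# remember them, and prefer known controllers on later runs.
import asyncio
from bleak import BleakScanner

CONTROLLER_PREFIX = "MHMD-Remote"    # hypothetical advertised device name
RSSI_THRESHOLD_DBM = -60             # closer devices report a stronger (less negative) RSSI
known_controllers: set[str] = set()  # stored record of previously connected controllers

async def find_controllers():
    devices = await BleakScanner.discover(timeout=5.0)
    candidates = [
        d for d in devices
        if (d.name or "").startswith(CONTROLLER_PREFIX)
        and d.rssi >= RSSI_THRESHOLD_DBM   # signal strength as distance proxy
        and (not known_controllers or d.address in known_controllers)
    ]
    # Strongest signal first: most likely the controller nearest this headset.
    candidates.sort(key=lambda d: d.rssi, reverse=True)
    for d in candidates:
        known_controllers.add(d.address)
    return candidates
```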
In addition, in some embodiments, the remote controller 30 may be equipped with one or more motion sensing elements, e.g., one or more sensors for detecting movement, acceleration, orientation, and so forth, referred to herein generally as “motion detection.” Thus, for example, in some embodiments, the remote controller may include one or more motion detection chip(s), e.g., 9-axis motion detection chips, although other numbers of motion-related axes may be used as desired. The remote controller 30 may communicate its current motion state (which may include orientation) to the mobile computing device 50 according to some specified criteria, e.g., at a specified frequency, e.g., one or more times per second, or when the motion state changes, e.g., by a specified amount. When the remote controller 30 is attached to the main body 10, the application running on the mobile device 50 may be able to determine the starting position and orientation of the remote controller 30 in relation to the main body 10 or mobile device 50. This information may be used to track the position and orientation of the remote controller with greater accuracy. When the motion data from the remote controller 30 is used in a simulation that uses a human armature, the motion can be computationally mapped to the constraints of the human form, thus providing a method of using the remote controller 30 as a virtual hand and gesture device with high accuracy in terms of the relation to the user's own hand.
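The "specified criteria" for reporting can be made concrete with a small sketch: transmit at a fixed cadence, or sooner whenever orientation changes by more than a threshold. The rate and threshold values below are illustrative assumptions, not values from this disclosure.

```python
import math
import time

class MotionReporter:
    """Sketch of controller-side report throttling: send the motion state at a
    fixed rate, or early if orientation changes by more than a threshold."""

    def __init__(self, send, rate_hz=5.0, threshold_deg=2.0):
        self.send = send                         # callable that transmits to the phone
        self.min_interval = 1.0 / rate_hz        # e.g., five updates per second
        self.threshold = math.radians(threshold_deg)
        self.last_sent_at = 0.0
        self.last_orientation = (0.0, 0.0, 0.0)  # roll, pitch, yaw in radians

    def update(self, orientation, acceleration):
        now = time.monotonic()
        moved = max(abs(a - b) for a, b in zip(orientation, self.last_orientation))
        if moved > self.threshold or now - self.last_sent_at >= self.min_interval:
            self.send({"orientation": orientation, "acceleration": acceleration})
            self.last_sent_at = now
            self.last_orientation = orientation
```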
When using a fiducial marker 60, the mobile computing device's camera 54 (see
This process is illustrated in
If a marker is detected, then the marker's position and rotation are detected at 106. Because each face of the fiducial marker (each a marker in itself) is distinct, the computer vision software can determine the distance (relative position to the MHMD camera), and thus the location in free space, and the rotation based upon the angle of the markers presented to the camera.
Next, the visualization engine (e.g., virtual reality or augmented reality software) provides a real-time stream of data (either game data for VR applications or video captured by the MHMD camera for augmented reality) to the wearer, with a "virtual" item interspersed within that data as oriented, located, and rotated by the user based upon the fiducial marker data observed.
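The disclosure does not name a computer vision library, but the detect-then-pose flow just described (marker detection, then position and rotation at 106) maps onto widely available marker trackers. The sketch below uses OpenCV's ArUco module as a stand-in (the exact API varies by OpenCV version), with illustrative marker size and camera intrinsics.

```python
import cv2
import numpy as np

# Illustrative stand-ins; a real system would use its own marker set and
# calibrated camera intrinsics.
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
MARKER_SIDE_M = 0.03
CAMERA_MATRIX = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
DIST_COEFFS = np.zeros(5)

def detect_marker_pose(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, DICTIONARY)
    if ids is None:
        return None                     # no marker detected: keep scanning the feed
    rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
        corners, MARKER_SIDE_M, CAMERA_MATRIX, DIST_COEFFS)
    # rvec encodes the marker's rotation, tvec its position relative to the
    # camera (the position/rotation detected at 106); the visualization
    # engine can place the "virtual" item accordingly.
    return ids[0][0], rvecs[0], tvecs[0]
```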
The lens assembly 20 as illustrated in
Because the distance from CPa or CPb to the center of each respective lens 21a and 21b is known, the IPD may be derived therefrom. The conductive material thus provides a contact point with substantial accuracy (e.g., comparable to typing on a capacitive mobile device screen), enabling the mobile device screen 52 to be adequately calibrated based upon the IPD so derived.
Here, as shown in
If the position is unchanged from the last known position (or a beginning default position) at 1002, then the process returns to the beginning to await a change. If the position is changed at 1002, then a new lens position is calculated at 1003 based upon the known distance (and angle) of the center of the respective lens 21a, 21b, and the (x,y) location of the stylus.
Finally, the virtual reality or augmented reality software (or driver) re-computes any changes to the data displayed on the mobile computing device screen 52 at 1004. This may mean that the images shown on the mobile computing device screen 52 should be shown further apart or closer together, or with a larger "black" or "darkened" gap between the two images, in order to ensure that the images presented properly converge for a user wearing the MHMD given the updated IPD. Failure to do so may make a wearer cross-eyed, give a wearer headaches, cause a wearer to feel dizzy, or otherwise degrade the experience of the MHMD wearer.
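To make the geometry of steps 1002-1004 concrete: if each stylus tip sits at a known offset directly below its lens center, the horizontal distance between the two touch points equals the distance between the lens centers, i.e., the IPD. The sketch below assumes an illustrative screen-density value that does not come from this disclosure.

```python
# Sketch of steps 1002-1004 under assumed values: derive the IPD from the
# two stylus touch x-coordinates and recenter each eye's image accordingly.
PX_PER_MM = 16.0  # screen density; varies per phone model (assumed here)

def ipd_mm(left_touch_x_px, right_touch_x_px):
    # With each stylus tip directly below its lens center, the horizontal
    # separation of the touches equals the lens-center separation (step 1003).
    return (right_touch_x_px - left_touch_x_px) / PX_PER_MM

def eye_image_centers(ipd, screen_width_px):
    # Step 1004: re-place the left/right images so they converge for this
    # IPD, widening or narrowing the gap between the two half-screen images.
    half_ipd_px = ipd * PX_PER_MM / 2.0
    mid = screen_width_px / 2.0
    return mid - half_ipd_px, mid + half_ipd_px
```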
The capability to dynamically detect these positions is necessary in the present application because there is no standardized hardware (or IPD) being employed. In situations in which a single screen size is used for all software (e.g., the Oculus Rift headset from Oculus VR, Inc.), the IPD may be pre-set regardless of the wearer, as it was in the first version of the Rift. Without adjusting for IPD, the focal point of the wearer may be incorrectly calibrated relative to the images being displayed.
Here, in a situation in which the lenses 21a and 21b are moveable for the comfort of the wearer, determining the IPD is an important part of providing a quality experience to the user. The introduction of variable screen sizes, because many different types and sizes of mobile devices may be used in the present MHMD, only complicates things further.
Other methods for calculating IPD may also be employed, including incorporating a set of "wheels" or "gears" to enable the lenses to be dynamically moved by a wearer, while set within an MHMD, while simultaneously tracking the specific rotation of those wheels or gears such that IPD may be derived from their current orientation. Similarly, a backwards-facing camera (including one built into a mobile device 50 that faces the same direction as the mobile computing device screen 52) may be capable, in conjunction with suitable software, of detecting the location of one or both lenses 21a, 21b based upon fiducial markers, visual markers, or other elements interposed on the face of any lens assembly 20.
Turning to
Preferably, the main body 10 may be entirely or primarily formed from a durable foam material. This material provides flexibility, especially to flex inward for smaller heads and spread apart for larger heads, as illustrated in
The use of foam material to construct the main body (and/or other portions of the apparatus) may allow one or more of the components described above to be omitted or replaced, because the foam material itself provides the functionality of the omitted components. In other words, where the foam construction supplies the functionality described above with respect to a component, a separate and distinct part for that function becomes unnecessary.
For example, the use of foam material allows for the omission or replacement of (separate) external frame 19 as described in
As another example, the use of foam material allows for the omission or replacement of (separate) components 51a, 51b and 51c of the internal frame 51, as described in
As yet a further example, the use of foam material allows for the omission or replacement of (separate) components of the lens frame 28, as described in
The main body 10 may have a unibody construction, i.e., the main body may be a single piece of foam material.
Note that other materials, such as rubber, plastic, or a combination of materials and structure (such as an interior frame wrapped with less dense foam covered in a fabric mesh), may also be used as desired.
Exemplary Method of Use
The user 70 may run (execute) a system compatible application on the mobile computing device 50. In some embodiments, once the application has loaded and following any set-up steps required by the application, the user may insert the mobile computing device 50 into the slot 18 of the main body 10, or into the mobile computing device frame 19 and then into the slot 18 of the main body, or otherwise incorporate the mobile computing device into the system. The user may then affix the system to his/her head by positioning the main body 10 in front of their eyes, much like wearing a pair of goggles or glasses. The user may then position the strap 40 around their head so that the main body 10 is secured to the user's head. The user may now see the mobile computing device 50 (or more specifically, the screen thereof) through the lens assembly 20, where the lens assembly may allow each eye to see only a discrete (respective) portion of the mobile computing device screen 52, which allows for a 3D or stereoscopic viewing experience. Alternatively, the user may don the main body, then insert or attach the mobile computing device.
Depending on the application, the user may use the remote controller 30 to interact with the application via controller motion and/or button presses. The remote controller 30 may send information to the mobile computing device 50, which may expose (or communicate) the information to the (system compatible) application, where the information may be programmatically used to interact with the application. The types of applications envisioned include augmented reality, virtual reality, and 3D media applications; however, the use of the system for other types of applications is contemplated and expected.
For example, in one exemplary case of a virtual reality application, the user may be (virtually) placed in a virtual environment where the application may display a stereoscopic image of the virtual environment onto the mobile computing device screen 52. In the case where the mobile computing device contains motion sensors, the movement of the device may be interpreted in the virtual world as controlling a virtual camera mimicking or tracking the motion of the user's head. This may allow the user to see into the virtual world and look around as if the user were actually there.
In cases of computer vision applications, the device camera 54 may be used to identify fiducial markers. For example, the application running on the device may utilize computer vision to “see” (and recognize) a fiducial marker of or on a viewed item in the camera video feed. Once a fiducial marker is detected, a virtual object may be displayed on top of (or overlaid on) the stereoscopic video, to the effect that the virtual object is presented in the real world at scale, rotation, and position, relative to the user. The user may then interact with the object with the remote controller 30 or through movement.
The user may fit the remote controller 30 with a fiducial marker 60 to allow detection of the remote controller in the camera field of view (FOV).
The main body 10 may be used as, or configured with, a fiducial marker.
In some embodiments, toys or other physical objects may be used as markers.
Computer vision algorithms running on or in the application may make use of point clouds or natural features detection to determine the position, location, and/or size of objects in the physical world, and the user may move or position themselves relative to these objects.
If, on the other hand, the point cloud is not known (in step 402), then as indicated in step 405, if dynamic object creation is not implemented or enabled, the method may return to the beginning, as shown. Alternatively, if dynamic object creation is implemented or enabled, then in step 406 corresponding physical objects may be determined, and virtual objects matching the real (physical) objects may be dynamically generated, as indicated in step 407.
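The branch just described (step 402, then steps 405-407) reduces to a short decision routine. The helper names and data shapes below are hypothetical stand-ins, since the disclosure describes the flow rather than an API.

```python
def segment_objects(cloud):
    """Hypothetical stand-in for step 406: split the point cloud into
    per-object groups (a real system might cluster by spatial proximity)."""
    return []

def make_virtual_proxy(physical_object):
    """Hypothetical stand-in for step 407: dynamically generate a virtual
    object matching the detected physical one."""
    return {"source": physical_object, "mesh": None}

def handle_point_cloud(cloud, known_clouds, dynamic_creation_enabled):
    match = known_clouds.get(cloud.get("id"))  # step 402: is this cloud known?
    if match is not None:
        return match                           # use the pre-built virtual scene
    if not dynamic_creation_enabled:           # step 405: feature off
        return None                            # return to the start of the loop
    physical_objects = segment_objects(cloud)  # step 406
    return [make_virtual_proxy(o) for o in physical_objects]  # step 407
```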
Radio signals may be used for relative or absolute positioning among MHMDs.
In the use case of viewing 3D media, the user may load media content or an application that displays the media in a side by side format (e.g., in a stereoscopic format). The user may then view the media through the lens assembly 20 and may optionally use headphones 80, thus creating a 3D media experience.
Additionally, many more experiences are contemplated that do not fall under one of the above categories. The use of the mobile computing device 50 features not mentioned herein and many features that may be available in future mobile computing devices may enable developers to create applications and experiences for the system that are not listed above.
Note that the remote controller 30 illustrated in
As an alternative,
As mentioned above, the type of mobile computing device may vary depending on the size of the vertical pocket 510. For example, pocket 510 may accommodate modern smartphones or may be larger to accommodate tablet computers. The term "smartphone" will be used hereafter in place of "mobile computing device."
As described previously, the goggles 500 are preferably retained on a person's head using retention straps. For example, a rear strap 514 extends around the backside of a wearer's head, as seen in
In a second mode of operation, seen in
With reference back to the perspective views of
Because of the softness and pliability of the material of the main body 502, the inner anchor portions 526 of each of the docking clips 520 can be pushed through vertical slots 528 formed in the side walls 518 until the anchor portions are past the slots and within the interior cavity 508 of the main body 502. Once in place, the narrow neck portion 522 has a horizontal length that is substantially the same as the thickness of the side walls 518, such that the clips 520 are held firmly with respect to the main body 502. This is seen best in the horizontal section view of
The docking clips 520 may be clips of another form entirely or may use other attachment structures. For example, in place of the docking clips 520, Velcro®, adhesive pads, locking mechanisms, latches, grommets, magnets, and other similar attachment structures may be used; the docking clips 520 are merely the preferred option. Still further, a concave depression shaped like the back face of the remote control 512 may be formed in one or both side walls 518 of the main body so as to closely receive the remote and reduce its outward profile extending outside of the soft body. This latter solution helps reduce movement of the remote control 512 relative to the main body, thus reducing the chance of detachment from head movement.
In this regard, the soft main body 502 provides a comfortable "face feel," making it more tolerable to wear the goggles 500 for a longer period of time and enabling the entire main body 502 to conform around a wearer's face. Furthermore, a preferred foam material makes the main body 502 extremely lightweight, and the weight of the other components such as the lens assemblies 506 and remotes 512 is kept down so that the goggles 500 are easy to wear for long periods of time. Preferably, the goggles 500 have a maximum weight of about 150-230 gf with the head strap and lenses (but without the remotes 512), though certain foam formulations may reduce that further.
The material of the soft main body 502 is preferably a soft flexible foam, more preferably a closed-cell foam or a so-called "Integral Skin" foam. The formulation of the foam material may vary, and includes ethylene-vinyl acetate (EVA), polyurethane (PU), and HE foam; each of these alone or in various combinations may be utilized. It should be understood, however, that any material that can be molded into the shape of the main body 502 may be used, and though foam is preferred it is not the exclusive option. The main preference is the ability to mold the material into shape such that, when molding is complete, the material is soft, impermeable, and compressible. In addition, the material may be soft to the touch, and because the entire main body 502 is formed of the material, the entire main body 502 is soft to the touch. The material may have a relatively high tensile strength to resist wear and tearing. Some prior head mounted goggles utilize separate pieces of injection-molded plastic coupled together, which are brittle and, as a result, tend to break at the seams/junctions.
In a preferred embodiment, the entire main body 502 is formed of a single, homogeneous unitary foam member which may be injection molded, pour molded, or cold-form molded. The advantages of having a single unitary foam member include low manufacturing cost, because there is only a single mold and no assembly of components required, and structural integrity, because there is less opportunity for breakage at joints or seams between multiple different parts. The molded foam manufacturing technique accommodates complex internal shapes (e.g., slots for lens assemblies, nose bridge), and permits the inclusion of ancillary parts such as the strap anchors, either by being molded into the goggles or with the provision of shaped recesses and the like. Molding permits the interior walls to provide an appealing "face feel" and any desired texturing (to aid in grip of the face as well as comfort). The use of a foam "unibody" also allows for distinct outer shapes to be easily produced without affecting the mechanical functionality of the main body 502, thus allowing custom physical designs of the goggles that have a distinct look and feel to be easily manufactured. Finally, multiple colors and designs may easily be incorporated into the foam, including branding or advertising on any of the generally flat outer surfaces of the main body 502.
Alternatively, the main body 502 may be formed of an inner structural “skeleton” of sorts covered by a molded soft foam. In this embodiment, an internal portion or skeleton of the main body 502 is first molded with a higher density foam, or other plastic, and then the various internal and external contours of the main body 502 are formed by molding the softer foam around the skeleton. Although there are essentially two components of this type of body 502, because they are molded together into one piece they may also be referred to as a unitary foam member. In other words, once molded there is no need for attaching pieces together to form the body 502. Still further, the aforementioned internal frames 50, 51 or other internal components may be formed by inserts of material that is less compressible than the softer foam. For instance, inserts or frames may be combined with a soft foam body to define the retention pocket 510 or channels within which the lens assemblies 506 slide.
Furthermore, the use of a closed-cell or other water-resistant foam promotes hygiene and permits the main body 502 to be easily cleaned. That is, ancillary components such as the lens assemblies 506 and the remote controls 512 may be removed and a water-resistant foam body 502 may be wiped down or even immersed in water for cleaning. Foam types that are water-resistant, at least more so than open cell foams, include closed cell foams and Integral Skin foams. The latter includes an outer substantially non-porous skin formed during the mold process against the mold surface. Other materials that have been used are incapable of being easily disassembled or tend to absorb contaminants, whereas the closed-cell foam provides an exterior barrier to such contamination. In a further embodiment, the material may be seeded or coated with an antimicrobial chemical to kill bacteria.
With reference to
Now with reference to
The rear face of the front wall 530 is generally flat and vertical, but includes a pair of relatively large ramped protrusions 544 projecting rearward into the pocket 510. These protrusions 544 are located toward the top of the pocket 510 and are largest at their outer extents so as to contact and force both ends of the smartphone inward. That is, if the device is inserted off-center, the protrusions 544 tend to center the device. Furthermore, a plurality of smaller friction bumpers or nubs 546 also project rearward from the front wall 530 into the pocket 510. These nubs 546 are generally evenly distributed in two rows at the top and the bottom of the slot, as seen in
The smartphone inserts into the pocket 510 between the rear face of the front wall 530 and an internal divider wall 548 that extends parallel to the front wall, and is seen best in
As an additional precaution to retain the smartphone within the pocket 510, a pair of inward ledges 554 are formed at the top end of the slot, as seen in
As seen in
As was described above, the goggles 500 provide a system for detecting and communicating to the smartphone 572 the individual lens horizontal and vertical positions within the headset. This establishes the interpupillary distance (IPD). One means for automatically determining interpupillary distance is to take advantage of the capacitive touch features of the mobile device screen in conjunction with a stylus 594 attached to each lens assembly 506.
Capacitive touch screens, such as on smartphones, have varying sensitivities, and a response may be triggered in some from a simple touch from an inanimate and non-conductive object. A conductive path is not required if the stylus material's conductive properties allow the touch response to be triggered. However, this may create a problem with buildup of charge in the material, and may be impeded by the different sensitivities of smartphone capacitive screens. Nevertheless, this is considered a viable method of transferring touch inputs without the need of a conductive path. More commonly, an electrical current such as directly or indirectly from a user's fingertip is necessary, or at least the use of a stylus with a magnet or some form of ferrous material in its tip. The present application contemplates styluses integrated within the MHMD goggles that transmit a touch and initiate a touch response on capacitive touch screens regardless of the means. Thus, the term "touch input" encompasses all such configurations.
The significance of touching the smartphone screen can be to locate the lens assembly 600, thus setting the IPD distance. Alternatively, the ability to touch the smartphone screen can be utilized as a button, switch, or prompt to make a decision with regard to software running in the smartphone. For example, the first time a wearer puts on the goggles, the smartphone may initiate an IPD calibration, wherein the wearer positions the lens assemblies 600 to his or her specification and initiates the stylus touch. Subsequently, the smartphone software may require inputs which can be translated through the stylus touch. For example, a number of YES or NO options can be presented to the wearer, wherein one touch means YES and two touches means NO (or a right side touch means YES and a left side touch means NO). Of course, there are numerous other possibilities of such communication. Furthermore, as mentioned above, there may be more than one pair of touch styluses provided for the goggles, which may allow for one dedicated pair (which may or may not be in constant contact with the screen of an inserted smartphone) for IPD calibration and one or more other pairs for communicating decisions. Indeed, the use of two or more inputs greatly enhances the user experience, much as a two button mouse is greatly superior to a single button mouse for interacting with a computer.
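A minimal sketch of the decision scheme suggested above, under the right-side-YES / left-side-NO convention; the mapping is illustrative, not prescribed by the disclosure.

```python
# Illustrative mapping of stylus touch events to YES/NO answers: a touch
# on the right half of the screen means YES, on the left half means NO.
def interpret_touch(touch_x_px, screen_width_px):
    return "YES" if touch_x_px > screen_width_px / 2 else "NO"

# Under the tap-count convention instead, one tap within a short window
# would mean YES and two taps NO; both schemes ride on ordinary touch events.
```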
As was described above, the remote controllers 512 may include one or more 9-axis motion detection chip(s), although other numbers of motion-related axes may be used as desired. The remote controllers 512 may communicate their current motion state (which may include orientation) to the smartphone 572 at a specified frequency, e.g., one or more times per second, or when the motion state changes, e.g., by a specified amount.
The ability to attach and detach as well as positionally dock the controllers 512 to the main body 502 enables the user to easily keep track of the controllers. While docked to the side of the main body 502, the controllers 512 can also be used in situations where the user would not need to utilize the full features of the controller, as depicted in
Furthermore, once the remote controllers 512 are docked onto the known position on the sides of the goggle main body 502, the system can then use the motion data from the controllers to track the user's head while it is in motion. When docked, software on the smartphone 572 knows the orientation of the remote controller 512 based upon the docking configuration (e.g. the remote controller 512 only docks in one position on the goggles). The data generated by the remote controller 512 may be provided in place of or in addition to data derived directly by a smartphone.
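Because the dock admits the controller in only one orientation, the rotation from the controller's body frame to the head frame is a fixed, known offset. The following is a minimal sketch, assuming SciPy's rotation utilities and an illustrative 90° mounting offset; the composition order depends on the frame conventions chosen.

```python
from scipy.spatial.transform import Rotation as R

# Fixed controller-to-head rotation known from the one-way docking geometry.
# The 90-degree value is illustrative, not taken from the disclosure.
DOCK_OFFSET = R.from_euler("z", 90, degrees=True)

def head_orientation(controller_quat_xyzw):
    """Treat a docked controller's IMU orientation as head orientation by
    composing it with the known mounting offset."""
    controller = R.from_quat(controller_quat_xyzw)  # quaternion as (x, y, z, w)
    return controller * DOCK_OFFSET                 # order depends on frame conventions
```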
In addition, the docking mechanism can mechanically activate the head-tracking mode on the controller. For example, the bumps 574 on the sides of the goggles, under or near the docking clips 520, may depress the docking switches 576 (see
Similarly, other methods of detection may be employed in place of the docking switches 576. Infrared, camera-based, light-based, magnetic, capacitive, proximity sensors and other systems used by the smartphone and/or remote controller 512 may be used in order to detect that the remote controller 512 has been docked with the goggles. For example, a capacitive sensor may be exposed in a recess within the main body 502 such that, when the remote controller 512 is docked, a small capacitive stylus touches the capacitive sensor, thereby indicating that the remote controller 512 is docked. Similarly, infrared, camera-based, light-based, or proximity sensors may be employed to note when the remote views a particular light pattern, repeating light, light color, or similar indicator emitted by the smartphone and/or main body 502 (e.g., through a particular recess in the side of the main body 502 that corresponds to a counterpart sensor in a remote controller 512) in order to determine that the remote controller 512 is docked. Attachment to a magnet may close an exposed circuit on the main body 502 that indicates that the remote controller 512 is attached to the main body 502. Also, the controller 512 may include a male USB jack that inserts into a female port provided in the side of the body 502, which signals that the controller is docked and also provides a convenient means for data or power transfer. These and various other docking detection methods may be employed.
Once docked, and once recognized by either or both of the remote controller 512 and the smartphone 572, the remote controllers may provide orientation, location, motion, and rotation data to the smartphone. The sensors or the integrated motion detection chip within the remote controllers 512 may be purpose-built for generating motion-related data. As a result of the increased use of motion controllers (such as in the Wii and, now, the Wii U) and smartphones' use of gyroscopes to determine screen orientation, direction, and the like, there are now very powerful integrated chips that are capable of quickly providing and calculating device orientation, movement, and rotation. However, in order to save costs, the most powerful integrated chips are seldom integrated into smartphones. Instead, only those sensors that provide some benefit, and only to the level that they provide that benefit, are typically incorporated into smartphones.
Because such detailed data pertaining to orientation, location, movement, and rotation is desirable in a high-quality motion-detecting remote control like remote controller 512, the integrated chips chosen for integration into the remote controller 512 can be of the best, most cost-effective quality. These chips can include (or have access to, along with related algorithms) one or more gyroscopes, gravitometers, compasses, magnetometers, cameras (both infrared and video), and other similar sensors used for determining orientation, location, movement, and rotation. Collectively, these are called "motion sensors" within this application. Further, because the remote control in the present application may be used in conjunction with a standard smartphone which is not designed to perform such detailed calculations in order to provide head-tracking data, the remote control provides an opportunity to offload some of that functionality at substantially reduced cost. The data generated by one or more of these remote controllers 512 may be extremely accurate, quickly generated, and transmitted to a smartphone for action thereon. The remote controller 512 is shown as a remote control device, but may instead be a fixed or detachable device including motion sensors and a processor that is only used in conjunction with the headset to augment the motion sensing capability of a smartphone. Herein, these types of devices are also called remote controllers.
The process 200 shown in
In some cases, the remote may also be used to perform sensor fusion in addition to providing raw sensor data or updated motion information to a smartphone 572. In such cases, the remote's integrated chips may obtain all location, motion, and rotation data and perform so-called “sensor fusion” to integrate that data into a current location, motion, and rotation. That data may be handed off directly to the smartphone for use in rendering the current (or future) frames of video. Based upon that raw data, the remote controller 512 may also perform predictive functions on the location, motion, and rotation data to thereby suggest future location, motion, and rotation of the goggle.
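As a concrete (if deliberately simple) instance of such sensor fusion, a complementary filter blends fast-but-drifting gyroscope integration with slow-but-absolute accelerometer tilt, and the same state supports the naive prediction mentioned above. This is a sketch of the general technique, not the disclosure's specific implementation.

```python
import math

class ComplementaryFilter:
    """Blend gyro integration (fast, drifts) with accelerometer tilt (slow,
    absolute) into one pitch estimate; a minimal example of sensor fusion."""

    def __init__(self, alpha=0.98):
        self.alpha = alpha  # weight given to the integrated gyro estimate
        self.pitch = 0.0    # radians

    def update(self, gyro_pitch_rate, accel_xyz, dt):
        ax, ay, az = accel_xyz
        accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
        self.pitch = (self.alpha * (self.pitch + gyro_pitch_rate * dt)
                      + (1.0 - self.alpha) * accel_pitch)
        return self.pitch

    def predict(self, gyro_pitch_rate, horizon_s):
        # Naive constant-rate extrapolation, of the kind the remote might use
        # to suggest future orientation to the rendering side.
        return self.pitch + gyro_pitch_rate * horizon_s
```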
The remote controller 512 may perform motion sensor fusion in place of or in addition to motion sensors in the smartphone 572, wherein the controller takes over some of the work for the smartphone. Relieved of most tasks related to obtaining orientation, motion, and rotation data, the smartphone may apply its processing power to processor-intensive video rendering applications based upon the data provided by the remote.
Desirably, the remote controllers 512 may both be equipped with a camera 704 to provide an additional video stream to the device for use in conjunction with computer vision algorithms. The additional cameras 704 can be used in conjunction with the camera on the smartphone 572 to provide a stereo image of the environment. Providing even one controller 512 on a side of the main body 502 supplies an additional video stream, thereby further enhancing the capabilities of the computer vision algorithms by enabling the cameras of the smartphone 572 and remote control 512 to work in conjunction to provide a stereo image of the external environment. Even more cameras (one on each of two mounted remote controls 512 plus the smartphone 572 camera) may provide still more accuracy. The cameras 704 on the controllers 512 may be RGB cameras, depth cameras, or simply BW or UV cameras.
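The benefit of the second camera can be stated as a worked relationship: for a rectified stereo pair with focal length f (in pixels) and baseline B (the distance between the two cameras), a feature seen with pixel disparity d lies at depth Z = f * B / d. The constants below are illustrative assumptions, not values from this disclosure.

```python
FOCAL_LENGTH_PX = 800.0  # focal length in pixels, from camera calibration (illustrative)
BASELINE_M = 0.12        # smartphone camera to controller camera distance (assumed)

def depth_from_disparity(disparity_px):
    """Triangulate depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # no measurable parallax; feature is effectively at infinity
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px
```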
The detachable controllers 512 are also used to establish relative location of the system's motion components. Specifically, knowledge of the location and orientation of the controllers 512 allows the system to calibrate the locations of the various motion components relative to each other. Furthermore, the system can then use the default positions and orientations to provide positional and rotational offsets relative to the default, thus allowing the system to track the motion of the components relative to one another. This may, for example, act as a "reset" when motion tracking algorithms go awry; the user may apply and remove a controller 512 from his or her head to reset the motion tracking algorithm from a known starting point. This is also useful when the user removes the remote controller 512 from the headset by hand: the system can then track the controller motion, apply it to a virtual rig of a human skeletal structure, and compute the user's virtual hand position based on the real-world hand position.
Another configuration for the main body of the goggles of the present application is a collapsible form. For example, the various walls of the main body 502 illustrated above with respect to
Closing Comments
Throughout this description, the embodiments and examples shown should be considered as exemplars, rather than limitations on the apparatus and procedures disclosed or claimed. Although many of the examples presented herein involve specific combinations of method acts or system elements, it should be understood that those acts and those elements may be combined in other ways to accomplish the same objectives. Acts, elements and features discussed only in connection with one embodiment are not intended to be excluded from a similar role in other embodiments.
As used herein, “plurality” means two or more. As used herein, a “set” of items may include one or more of such items. As used herein, whether in the written description or the claims, the terms “comprising”, “including”, “carrying”, “having”, “containing”, “involving”, and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of”, respectively, are closed or semi-closed transitional phrases with respect to claims. Use of ordinal terms such as “first”, “second”, “third”, etc., in the claims to modify a claim element does not by itself connote any priority, precedence, or order of one claim element over another or the temporal order in which acts of a method are performed, but are used merely as labels to distinguish one claim element having a certain name from another element having a same name (but for use of the ordinal term) to distinguish the claim elements. As used herein, “and/or” means that the listed items are alternatives, but the alternatives also include any combination of the listed items.