An optical conveyance system is disclosed for relocating a visual input position of an electronic device. The electronic device has a frontside camera and a backside camera respectively positioned on opposite sides of the electronic device. An optical receiver is configured to receive visual input. An optical output is positioned over the frontside camera of the electronic device and is configured to project visual output into the frontside camera of the electronic device. An optical conveyance device is optically connected between the optical receiver and the optical output. The optical conveyance device is configured to convey the visual input received through the optical receiver to the optical output for projection as the visual output into the frontside camera of the electronic device.
14. A method for providing stereoscopic vision within a head-mounted display device, the head-mounted display device having an electronic device installed within the head-mounted display device, the electronic device having a display screen and a frontside camera on a front side of the electronic device and a backside camera on a back side of the electronic device, the head-mounted display device configured to optically present images displayed on the display screen of the electronic device as immersive visual content, comprising:
receiving a first visual input through a first optical receiver integrated within the head-mounted display, the first optical receiver having a first optical view axis substantially aligned with a first eye of a user when the head-mounted display is worn by the user;
receiving a second visual input through a second optical receiver integrated within the head-mounted display, the second optical receiver having a second optical view axis substantially aligned with a second eye of the user when the head-mounted display is worn by the user;
optically conveying the first visual input through a first optical conveyance device to a first optical output positioned over the frontside camera of the electronic device so that the first visual input is projected into the frontside camera of the electronic device, the first optical conveyance device and the first optical output integrated within the head-mounted display;
optically conveying the second visual input through a second optical conveyance device to a second optical output positioned over the backside camera of the electronic device so that the second visual input is projected into the backside camera of the electronic device, the second optical conveyance device and the second optical output integrated within the head-mounted display;
processing the first visual input projected into the frontside camera in conjunction with the second visual input projected into the backside camera to provide stereoscopic vision within a real world environment corresponding to the combined fields of view of the first optical receiver and the second optical receiver.
1. A head-mounted display device including an optical conveyance system for relocating visual input positions of an electronic device, the electronic device having a frontside camera and a backside camera respectively positioned on opposite sides of the electronic device, comprising:
a first optical receiver integrated within the head-mounted display, the first optical receiver configured to receive a first visual input, the first optical receiver having a first optical view axis substantially aligned with a first eye of a user when the head-mounted display is worn by the user;
a first optical output integrated within the head-mounted display, the first optical output positioned over the frontside camera of the electronic device, the first optical output configured to project a first visual output into the frontside camera of the electronic device;
a first optical conveyance device integrated within the head-mounted display, the first optical conveyance device optically connected between the first optical receiver and the first optical output, the first optical conveyance device configured to convey the first visual input received through the first optical receiver to the first optical output for projection as the first visual output into the frontside camera of the electronic device;
a second optical receiver integrated within the head-mounted display, the second optical receiver configured to receive a second visual input, the second optical receiver having a second optical view axis substantially aligned with a second eye of the user when the head-mounted display is worn by the user;
a second optical output integrated within the head-mounted display, the second optical output positioned over the backside camera of the electronic device, the second optical output configured to project a second visual output into the backside camera of the electronic device; and
a second optical conveyance device integrated within the head-mounted display, the second optical conveyance device optically connected between the second optical receiver and the second optical output, the second optical conveyance device configured to convey the second visual input received through the second optical receiver to the second optical output for projection as the second visual output into the backside camera of the electronic device.
2. The head-mounted display device as recited in
3. The head-mounted display device as recited in
4. The head-mounted display device as recited in
5. The head-mounted display device as recited in
wherein the second combination of optical components is configured such that the second visual output projected into the backside camera of the electronic device is equivalent to the second visual input received through the second optical receiver as if the backside camera of the electronic device were positioned and oriented in a same manner as the second optical receiver.
6. The head-mounted display device as recited in
7. The head-mounted display device as recited in
8. The head-mounted display device as recited in
9. The head-mounted display device as recited in
10. The head-mounted display device as recited in
11. The head-mounted display device as recited in
12. The head-mounted display device as recited in
13. The head-mounted display device as recited in
wherein the second optical receiver is positioned to have a field of view that overlaps a field of view of the first optical receiver.
15. The method as recited in
16. The method as recited in
17. The method as recited in
adjusting a separation distance measured perpendicularly between the first optical view axis of the first optical receiver and the second optical view axis of the second optical receiver.
18. The method as recited in
using the stereoscopic vision within the real world environment corresponding to the combined fields of view of the first optical receiver and the second optical receiver to identify objects within the real world environment.
19. The method as recited in
using the stereoscopic vision within the real world environment corresponding to the combined fields of view of the first optical receiver and the second optical receiver to determine positions and distances to objects identified within the real world environment.
20. The method as recited in
notifying the user of the head-mounted display device of one or more of the identification of objects within the real world environment, the position of objects within the real world environment, and the distance to objects within the real world environment.
21. The method as recited in
using the stereoscopic vision within the real world environment corresponding to the combined fields of view of the first optical receiver and the second optical receiver for tracking of objects within the real world environment as the head-mounted display device is moved; and
using the tracking of objects within the real world environment as the head-mounted display device is moved to assist with navigation of a point of view of the user of the head-mounted display within the immersive visual content.
This application claims priority under 35 U.S.C. 119(e) to U.S. Provisional Patent Application No. 62/403,009, filed Sep. 30, 2016. The disclosure of the above-identified patent application is incorporated herein by reference in its entirety.
The computing industry and the video game industry have seen many changes over the years. As computing power has expanded, developers of video games have created game software adapted to the increased computing power. To this end, video game developers have been coding games that incorporate sophisticated operations and mathematics to produce a very realistic game experience.
These games are presented as part of a gaming system that includes game consoles and portable game devices, and/or are provided as services over a server or the cloud. As is well known, the game console is designed to connect to a monitor (usually a television) and enable user interaction through handheld controllers/input devices. A game console may include specialized processing hardware, including a central processing unit (CPU), a graphics processing unit (GPU) for processing intensive graphics operations, a vector unit for performing geometric transformations, and other glue hardware, firmware, and software. The game console may be further designed with an optical disc tray for receiving game compact discs for local play through the game console. Online and multi-player gaming is also possible, where a user can interactively play against or with other users over the Internet. As increasingly complex games continue to intrigue players, game and hardware manufacturers have continued to innovate to enable additional and more realistic interactivity.
A growing trend in the computer gaming industry is to develop games that increase the interaction between the user and the gaming system. One way of accomplishing a richer interactive experience is to use wireless game controllers whose movement and gestures are tracked by the gaming system. These movements and gestures are used as inputs for the game. Gesture inputs, generally speaking, refer to having an electronic device, such as a computing system, video game console, or smart appliance, react to gestures that are made by the user while playing the game and captured by the electronic device.
Another way of accomplishing a more immersive interactive experience is to use a head-mounted display (HMD). The HMD is worn by the user and can be configured to present various graphics, such as a view of a virtual space, in a display portion of the HMD. The graphics presented within the HMD can cover a large portion or even all of a user's field of view. Hence, the HMD can provide an immersive experience to the user. As connectivity to the Internet continues to increase, more configurations of HMD systems have been introduced.
The HMD can also be used in a virtual reality system in which a user becomes visually immersed in a computer generated three-dimensional virtual reality scene. In some applications, the entire virtual reality scene as displayed to the user is computer generated. In other applications, a portion of the virtual reality scene is computer generated, with another portion of the virtual reality scene corresponding to video and/or images of real-life objects and/or persons, where such real-life video/images can be rendered in the virtual reality scene in essentially real-time. Such applications may be referred to as augmented reality applications. In some virtual reality applications, it is not only desirable to have the user feel visually immersed in the virtual reality scene, but it is also desirable to provide the user with an ability to select objects displayed within the virtual reality scene. It is within this context that the present invention arises.
In an example embodiment, an optical conveyance system for relocating a visual input position of an electronic device is disclosed. The electronic device has a frontside camera and a backside camera respectively positioned on opposite sides of the electronic device. The optical conveyance system includes an optical receiver configured to receive visual input. The optical conveyance system also includes an optical output configured to project visual output. The optical output is positioned over the frontside camera of the electronic device to project visual output into the frontside camera of the electronic device. The optical conveyance system also includes an optical conveyance device optically connected between the optical receiver and the optical output. The optical conveyance device is configured to convey the visual input received through the optical receiver to the optical output for projection as the visual output into the frontside camera of the electronic device.
In an example embodiment, a method is disclosed for providing stereoscopic vision within a head-mounted display device. An electronic device is installed within the head-mounted display device. The electronic device has a display screen and a frontside camera on a front side of the electronic device and a backside camera on a back side of the electronic device. The head-mounted display device is configured to optically present images displayed on the display screen of the electronic device as immersive visual content. The method includes receiving a first visual input through the backside camera of the electronic device. The method also includes receiving a second visual input through an optical receiver positioned to have a field of view that overlaps a field of view of the backside camera of the electronic device. The method also includes optically conveying the second visual input through an optical conveyance device to an optical output positioned over the frontside camera of the electronic device so that the second visual input is projected into the frontside camera of the electronic device. The method also includes processing the first visual input received through the backside camera in conjunction with the second visual input, as received through the optical receiver and projected into the frontside camera, to provide stereoscopic vision within a real world environment corresponding to the combined fields of view of the backside camera and optical receiver.
Other aspects of the invention will become more apparent from the following detailed description, taken in conjunction with the accompanying drawings, illustrating by way of example the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some or all of these specific details. In other instances, well known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
The following detailed description includes references to the accompanying drawings, which form a part of the detailed description. The drawings show illustrations in accordance with example embodiments. These example embodiments, which are also referred to herein as “examples,” are described in enough detail to enable those skilled in the art to practice the presented subject matter. The embodiments can be combined, other embodiments can be utilized, or structural, logical, and electrical changes can be made without departing from the scope of what is claimed. The following detailed description is therefore not to be taken in a limiting sense, and the scope is defined by the appended claims and their equivalents. In this document, the terms “a” and “an” are used, as is common in patent documents, to include one or more than one. In this document, the term “or” is used to refer to a nonexclusive “or,” such that “A or B” includes “A but not B,” “B but not A,” and “A and B,” unless otherwise indicated.
In some embodiments, a head mounted display device (HMD) is provided, which includes a slot or holder for receiving an electronic device, such as a smartphone or similar electronic device, so as to allow the screen of the electronic device to be the display device of the HMD. It should be understood that the electronic device for insertion into the HMD can be any type of electronic device that has a display screen for displaying visual content, and that is equipped to receive and/or generate content for display on the display screen, and that has a form factor suitable for insertion of the electronic device into the slot/holder of the HMD. In some embodiments, the electronic device referred to herein has wireless communication capability (e.g., telephone, cellular, Wi-Fi, Bluetooth, etc.). However, in some embodiments, the electronic device referred to herein may not have wireless communication capability.
The slot/holder 111 of the HMD 110 is configured to receive and hold the electronic device 100. The slot/holder 111 is also configured to include an opening/cutout 119 to provide a clear view for the backside camera 105 of the electronic device 100 when the electronic device 100 is inserted into the slot/holder 111. In some embodiments, the electronic device 100 can be inserted into the slot/holder 111 as indicated by arrow 113. However, it should be understood that in other embodiments, the slot/holder 111 can be configured to receive the electronic device 100 from essentially any direction of insertion. The HMD 110 itself does not include a display screen, but instead uses the display screen 101 of the electronic device 100, when the electronic device 100 is inserted or placed in the slot/holder 111 of the HMD 110. The HMD 110 can also include optics 115 for enabling viewing of the content rendered on the display screen 101 of the electronic device 100. It should be understood that the HMD 110 is configured to be worn over the eyes of its user, and in some embodiments can include a cutout 117 for positioning the HMD 110 on the user's nose. Also, the HMD 110 can include a strap or band or other type of device (not shown) for securing the HMD 110 to the user's head.
The electronic device 100 may be connected to electronics within the HMD 110 through wired and/or wireless connection(s). In some embodiments, the electronic device 100 can communicate with the Internet to access content, such as by streaming content or by downloading content, which can be rendered on the electronic device 100 when the electronic device 100 is inserted into the HMD 110. The HMD 110 can also include electronics for communicating with the electronic device 100 through wired and/or wireless connection(s).
In some embodiments, the content rendered on the display screen 101 of the electronic device 100 is distorted to produce three-dimensional (3D) images, and the optics 115 of the HMD 110 are configured to un-distort the 3D images so that the content displayed by the electronic device 100, when viewed through the optics 115 of the HMD 110, appears to be rich 3D visual content, such as images, videos, interactive data, etc. In some embodiments, when the electronic device 100 is inserted into the HMD 110, the visual content rendered on the display screen 101 of the electronic device 100 includes images that can be processed by circuitry/algorithms within the HMD 110 to appear as 3D images.
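Although the disclosure does not specify a particular distortion model, such pre-distortion is commonly implemented as a radial (barrel) warp chosen to cancel the pincushion distortion of the viewing optics. The Python sketch below illustrates the idea with a simple two-coefficient radial model; the coefficients, the nearest-neighbor sampling, and the function name are assumptions for illustration, not details taken from the disclosure.

```python
import numpy as np

def radial_predistort(image, k1=0.22, k2=0.24):
    """Barrel-warp an image so that pincushion viewing optics undo the warp.

    image: HxW or HxWx3 array; k1, k2 are assumed example coefficients,
    not values taken from the disclosure.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    # Normalize coordinates to [-1, 1] about the optical center.
    x = (xs - w / 2.0) / (w / 2.0)
    y = (ys - h / 2.0) / (h / 2.0)
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial distortion factor
    # Each output pixel samples the source at a radially scaled position,
    # compressing content toward the center (barrel distortion).
    src_x = np.clip((x * scale + 1.0) * (w / 2.0), 0, w - 1).astype(np.intp)
    src_y = np.clip((y * scale + 1.0) * (h / 2.0), 0, h - 1).astype(np.intp)
    return image[src_y, src_x]
```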
The content rendered by the electronic device 100 when present within the HMD 110 can be for essentially any type of computer application, and may include one or more types of content such as game, movie, audio, images, multimedia, among others. In some embodiments, the content, or portions thereof, is generated by one or more applications executing on the electronic device 100. However, in some embodiments, the content, or portions thereof, is streamed from a remote content source over a network to the electronic device 100. And, in some embodiments, the content, or portions thereof, is streamed from a cloud gaming infrastructure over a network to the electronic device 100. The cloud gaming infrastructure may also direct various types of content to be transmitted from the remote content source over a network to the electronic device 100.
An example remote content source is an Internet website that provides downloadable content and/or streaming content. The content provided by the remote content source can include any type of multimedia content, such as movies, games, static/dynamic content, pictures, social media content, social media websites, etc. In some embodiments, content data is transmitted from the remote content sources to the electronic device 100, where the content data is then rendered by the electronic device 100 in a format suitable for display within the HMD 110.
In some embodiments, the HMD 110 is configured to provide a view into an interactive virtual reality scene of a computer application. For example, some computer applications that may support virtual reality scene generation and display through the HMD 110 include games (such as first person shooter games), virtual tours (such as hotels, travel sites, global places of interest), augmented reality applications (such as for virtual meetings, collaboration between remote users, shared/synchronized virtual spaces), and augmented reality medical applications (such as remote examination, examination assistance, remote surgery, remote surgery assistance), among others. In the various computer applications, a user wearing the HMD 110 with the electronic device 100 present therein will be able to move their head in any direction to view other parts of the virtual reality scene. And, in the case of an interactive virtual reality scene, movement of the HMD 110 by way of movement of the user's head can be used to provide inputs to control movement of the user and/or other objects within the virtual reality scene, and/or take other actions within the virtual reality scene, such as zooming a view of the user in and out relative to an object present within the virtual reality scene.
Because the interactive content that can be rendered in the virtual reality scene in the HMD 110 is virtually boundless, a user is able to view and interact with the virtual reality scene in almost every dimension. Tracking of the user's movement can include use of inertial sensors disposed within the HMD 110 and/or use of inertial sensors disposed within the electronic device 100. The inertial sensors can include one or more accelerometers (such as a MEMS inertial accelerometer, among others) and/or one or more gyroscopes (such as a ring laser gyroscope, a fiber optic gyroscope, a MEMS gyroscope, among others). Some implementations of the HMD 110 may include more or fewer inertial sensors. And, some implementations of the HMD 110 may not include any inertial sensors within the HMD 110 itself.
For ease of description, the term “inertial sensor” as used herein refers to any type of inertial sensor that is capable of detecting/sensing movement of itself without an external reference. The inertial sensor generates inertial sensor data that provides information about the direction and rate of movement of the inertial sensor. The inertial sensor data can be analyzed to determine the direction and rate of movement of the HMD 110 and/or of the electronic device 100 present within the HMD 110, which in turn can be analyzed to determine the direction and rate of movement of the user wearing the HMD 110. In this manner, movements of the user as determined through analysis of the inertial sensor data can be used as inputs to the computer application executing to generate and render the virtual reality scene.
Therefore, through analysis of the inertial sensor data, the user is able to act as a human controller to affect specific actions within the interactive virtual reality scene. And, in some embodiments, the movements of the user and corresponding actions within the virtual reality scene can be naturally related to each other. For example, inertial sensor data indicating a lean forward by the user may be used by the computer application as an input to cause the user's viewpoint to move forward within the virtual reality scene. It should be appreciated that the types of user movement and corresponding actions within the virtual reality scene are essentially limitless, depending on the range of possible movements of the human body and the context of any given virtual reality scene.
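By way of illustration, the following Python sketch shows one way such inertial input could be reduced to a viewpoint control: gyroscope samples are integrated into a pitch estimate, and a sustained forward lean advances the viewpoint. The sample rate, threshold, step size, and function names are assumptions for illustration; the disclosure does not prescribe a specific algorithm.

```python
# A minimal sketch of inertial viewpoint control, under assumed conventions.
DT = 1.0 / 200.0       # assumed 200 Hz inertial sample rate
LEAN_THRESHOLD = 0.20  # assumed forward pitch (radians) treated as a lean
STEP = 0.01            # assumed forward step per sample while leaning

def update_viewpoint(pitch, gyro_pitch_rate, viewpoint_z):
    """Integrate one gyroscope sample and map a forward lean to motion.

    pitch: current pitch estimate (radians); gyro_pitch_rate: angular rate
    about the pitch axis (rad/s); viewpoint_z: viewpoint position along
    the view axis (meters). All names and constants are illustrative.
    """
    pitch += gyro_pitch_rate * DT  # dead-reckon orientation from the gyro
    if pitch > LEAN_THRESHOLD:     # sustained forward lean detected
        viewpoint_z += STEP        # move the user's viewpoint forward
    return pitch, viewpoint_z
```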
Images received through the backside camera 105 of the electronic device 100 can be processed in conjunction with the images received through the frontside camera 103 of the electronic device 100 (by way of the optical receiver 121 and optical connection between the optical receiver 121 and frontside camera 103) to provide stereoscopic vision with depth perception of the real world present around the user. It should be understood that when the user is wearing the HMD 110 with the electronic device 100 inserted into the HMD 110, the user is fully immersed in the visual content displayed on the display screen 101 of the electronic device 100, and the user is unable to see the real world present around them. In various embodiments, the electronic device 100 is configured to process images received through the frontside camera 103 (by way of the optical receiver 121) and the backside camera 105 to determine locations of objects in the real world relative to the user of the HMD 110.
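The disclosure leaves the particular image-processing method open. One conventional approach to deriving depth from the two camera feeds is block-matching stereo, sketched below with OpenCV; the assumption that the two frames are already rectified and converted to 8-bit grayscale, as well as the matcher parameters and which frame serves as the left image, are illustrative choices rather than requirements of the disclosure.

```python
import cv2

# Minimal stereo-depth sketch: assumes both frames are rectified, 8-bit
# grayscale images; matcher parameters are illustrative assumptions.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)

def disparity_map(receiver_gray, backside_gray):
    """Compute a disparity map from the two views.

    Larger disparity corresponds to a closer real-world surface. StereoBM
    returns fixed-point values scaled by 16, hence the division.
    """
    disp = stereo.compute(receiver_gray, backside_gray)
    return disp.astype("float32") / 16.0
```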
In some embodiments, optical properties of the optical receiver 121 can be configured to substantially match optical properties of the lens of the backside camera 105. The optical properties can include zoom setting and angle of view, among essentially any other optical property that characterizes the lens and/or optical components of the backside camera 105. Also, in some embodiments, optical properties of the optical receiver 121 can be configured to be intentionally different from optical properties of the lens and/or optical components of the backside camera 105. By configuring the optical properties of the optical receiver 121 to differ from those of the backside camera 105, correct optically-created stereo vision can be achieved within the HMD 110. Such a configuration also provides a dual-lens capability to various applications executed for display within the HMD 110. In various embodiments, applications that may benefit from the dual-lens capability can include improved photo/video capture, improved object/hand detection, and the provision of wide-angle, near-field, and long-range narrow-field data for HMD/phone Simultaneous Localization and Mapping (SLAM), among other improvements.
With stereoscopic vision capability, the electronic device 100 can be configured to accurately identify objects near the user in the real world and accurately determine both location and distance from the user to these identified objects. The electronic device 100 can also be configured to provide information to the user wearing the HMD 110 about the objects that are identified in the real world around the user. The information about objects present in the real world around the user can be conveyed to the user in many ways. For example, in some embodiments, objects identified in the real world can be shown on the display screen 101 of the electronic device 100. And, in some embodiments, the presence of objects identified in the real world can be communicated to the user through audible communication and/or through other types of visual communication by way of the display screen 101 of the electronic device 100 and/or through tactile communication, such as vibration, and/or through other forms of communication.
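For a calibrated pair of views, the distance determination described above follows from the standard triangulation relation between disparity and depth. As a worked example under assumed values (the approximately 63 millimeter baseline appears later in this disclosure; the focal length and disparity here are illustrative):

$$Z = \frac{f \cdot B}{d} = \frac{700\,\text{px} \times 0.063\,\text{m}}{20\,\text{px}} \approx 2.2\,\text{m}$$

where $Z$ is the distance to the object, $f$ is the camera focal length in pixels, $B$ is the baseline between the two optical view axes, and $d$ is the measured disparity in pixels.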
In some embodiments, identifying objects in the real world with depth perception provided by way of the stereoscopic vision mentioned above can improve safety of the user as the user moves physically within the real world while wearing the HMD 110. Also, in some embodiments, identifying objects in the real world with depth perception provided by way of the stereoscopic vision mentioned above can improve interactivity of the user with augmented reality applications in which the user's hand or other hand-held devices are moved within the real world to cause interaction with virtual objects displayed within the immersive visual content on the display screen 101 of the electronic device 100. And, in some embodiments, identifying objects in the real world with depth perception provided by way of the stereoscopic vision mentioned above can improve tracking of movement of the HMD 110. For example, in some embodiments, data about the accurate identification of objects and the accurate location and distance to the objects relative to the HMD 110, as provided through the stereoscopic vision mentioned above, can be combined with other HMD tracking data, such as inertial sensor data, to improve tracking of movement of the HMD 110, which can in turn be used to improve navigation of the user's viewpoint through the immersive visual content shown through the display screen 101 of the electronic device 100.
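One simple way to combine the stereo-derived tracking data with inertial sensor data, as described above, is a complementary filter that trusts the inertial prediction at high frequency and corrects it with the slower vision-based position fix. The blend factor and data conventions below are assumptions for illustration, not the disclosure's method.

```python
import numpy as np

ALPHA = 0.98  # assumed blend: 98% inertial prediction, 2% stereo correction

def fuse_hmd_position(inertial_pos, stereo_pos):
    """Complementary-filter blend of an inertially predicted HMD position
    with a stereo-vision position fix (both 3-vectors, in meters)."""
    inertial_pos = np.asarray(inertial_pos, dtype=float)
    stereo_pos = np.asarray(stereo_pos, dtype=float)
    return ALPHA * inertial_pos + (1.0 - ALPHA) * stereo_pos
```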
It should be understood that in order to achieve the stereoscopic vision, it is necessary to convey the visual input received through the optical receiver 121 around to the frontside camera 103 of the electronic device 100. This can be done in many different ways depending on the configuration of the HMD 110 and the configuration of the electronic device 100. In the example HMD 110 configuration of
In some embodiments, the optical conveyance device 131 is fully integrated within the HMD 110, such that the optical conveyance device 131 is not exposed outside of the HMD 110. However, in some embodiments, a portion of the optical conveyance device 131 may be exposed at the exterior surface of the HMD 110. For example, if the optical conveyance device 131 includes a focusing component, that focusing component may be exposed at the exterior surface of the HMD 110 to provide for focus adjustment. Also, it should be understood that the optical conveyance device 131 can include optical components, such as optical amplifiers, optical splitters, optical combiners, etc., as needed to accomplish conveyance of the visual input received through the optical receiver 121 around to the optical output 129.
The embodiments shown in
In some embodiments, the optical conveyance device 307 includes a combination of mirrors, waveguides, and/or prisms for projecting the visual input received through the optical receiver 305 onto the lens of the frontside camera 103 of the electronic device. In some embodiments, the optical conveyance device 307 includes a bundle of optical fibers configured to transmit light received through the optical receiver 305 around to the optical output 309. It should be understood that in various embodiments, the optical conveyance device 307 can be configured in different ways, so long as the visual input received through the optical receiver 305 is transmitted through the optical output 309 and onto the lens of the frontside camera 103 of the electronic device 100, so as to obtain the same effect as if the frontside camera 103 were positioned and oriented in the same manner as the optical receiver 305.
In some embodiments, the optical conveyance device 407 includes a combination of mirrors, waveguides, and/or prisms for projecting the visual input received through the optical receiver 405 onto the lens of the frontside camera 103 of the electronic device 100. In some embodiments, the optical conveyance device 407 includes a bundle of optical fibers configured to transmit light received through the optical receiver 405 around to the optical output 409. It should be understood that in various embodiments, the optical conveyance device 407 can be configured in different ways, so long as the visual input received through the optical receiver 405 is transmitted through the optical output 409 and onto the lens of the frontside camera 103 of the electronic device 100, so as to obtain the same effect as if the frontside camera 103 were positioned and oriented in the same manner as the optical receiver 405.
In some embodiments, the optical conveyance device 507 includes a combination of mirrors, waveguides, and/or prisms for projecting the visual input received through the optical receiver 505 onto the lens of the frontside camera 103 of the electronic device 100. In some embodiments, the optical conveyance device 507 includes a bundle of optical fibers configured to transmit light received through the optical receiver 505 around to the optical output 509. It should be understood that in various embodiments, the optical conveyance device 507 can be configured in different ways, so long as the visual input received through the optical receiver 505 is transmitted through the optical output 509 and onto the lens of the frontside camera 103 of the electronic device 100, when the electronic device 100 is installed in the case 300, so as to obtain the same effect as if the frontside camera 103 were positioned and oriented in the same manner as the optical receiver 505.
In some embodiments, the optical receivers 151 and 153 are positioned within the HMD 110A so that when the HMD 110A is worn by the user, the optical view axes 157 and 159 of the optical receivers 151 and 153 will be substantially aligned with respective eyes of the user. In this configuration, the visual input received through the optical receiver 151 and conveyed to the frontside camera 103 will be substantially equivalent to visual input that would normally be received through the right eye of the user in the absence of the HMD 110A. And, the visual input received through the optical receiver 153 and conveyed to the backside camera 105 will be substantially equivalent to visual input that would normally be received through the left eye of the user in the absence of the HMD 110A. In this manner, the visual input provided to the user's eyes through the optical receivers 151 and 153, the frontside camera 103, and the backside camera 105 can effectively represent the visual input that would normally be seen by the user's eyes in the absence of the HMD 110A, thereby providing the user with an ability to see the real world through the HMD 110A in a more realistic manner when wearing and using the HMD 110A. It should be appreciated that positioning of the optical receivers 151 and 153 to substantially align with the eyes of the user provides improved stereoscopic placement of visual inputs that get conveyed to the frontside camera 103 and the backside camera 105, which enables substantial matching of the user's real-world vision.
In some embodiments, the optical conveyance devices 163 and 167 can be fully integrated within the HMD 110A so as to not be exposed outside of the HMD 110A. However, in some embodiments, a portion of one or both of the optical conveyance devices 163 and 167 can be exposed at the exterior surface of the HMD 110A. For example, if the optical conveyance device 163/167 includes a focusing component, that focusing component may be exposed at the exterior surface of the HMD 110A to provide for focus adjustment. Also, it should be understood that one or both of the optical conveyance devices 163 and 167 can include optical components, such as optical amplifiers, optical splitters, optical combiners, etc., as needed to accomplish conveyance of the visual input received through the optical receivers 151 and 153, respectively, around to the optical outputs 161 and 165, respectively.
In accordance with the foregoing, an optical conveyance system for relocating a visual input position of an electronic device is disclosed herein. The electronic device (100) has a frontside camera (103) and a backside camera (105) respectively positioned on opposite sides of the electronic device (100). The optical conveyance system includes an optical receiver (121, 305, 405, 505) configured to receive visual input. The optical receiver (121, 305, 405, 505) is positioned to have a field of view that overlaps a field of view of the backside camera (105) of the electronic device (100). The optical conveyance system also includes an optical output (129, 309, 409, 509) configured to project visual output. The optical output (129, 309, 409, 509) is positioned over the frontside camera (103) of the electronic device (100) to project visual output into the frontside camera (103) of the electronic device (100). The optical conveyance system also includes an optical conveyance device (131, 307, 407, 507) optically connected between the optical receiver (121, 305, 405, 505) and the optical output (129, 309, 409, 509). The optical conveyance device (131, 307, 407, 507) is configured to convey the visual input received through the optical receiver (121, 305, 405, 505) to the optical output (129, 309, 409, 509) for projection as the visual output into the frontside camera (103) of the electronic device (100).
In some embodiments, the optical receiver (121, 305, 405, 505) includes a camera lens for receiving the visual input. In some embodiments, the optical output (129, 309, 409, 509) includes a lens for projecting the visual output into the frontside camera (103) of the electronic device (100). In some embodiments, the optical conveyance device (131, 307, 407, 507) includes a combination of optical components, where the optical components can include one or more of a mirror, a waveguide, a prism, and an optical fiber. In some embodiments, the combination of optical components is configured such that the visual output projected into the frontside camera (103) of the electronic device (100) is equivalent to the visual input received through the optical receiver (121, 305, 405, 505) as if the frontside camera (103) of the electronic device (100) were positioned and oriented in a same manner as the optical receiver (121, 305, 405, 505). In some embodiments, the optical conveyance device (131, 307, 407, 507) traverses through at least two orthogonal changes in direction between the optical receiver (121, 305, 405, 505) and the optical output (129, 309, 409, 509).
In some embodiments, the optical receiver (121), the optical conveyance device (131), and the optical output (129) are integrated within a head-mounted display device (110). In some embodiments, the head-mounted display device (110) is configured to optically transform images displayed on a display screen (101) of the electronic device (100) into immersive visual content. In some embodiments, the head-mounted display device (110) is configured to receive and hold the electronic device (100) in a position in which an optical view axis (125) of the optical receiver (121) is substantially parallel to and commonly oriented with an optical view axis (127) of the backside camera (105) of the electronic device (100), and in which the optical output (129) is positioned over the frontside camera (103) of the electronic device (100). In some embodiments, a separation distance (123) measured perpendicularly between the optical view axis (125) of the optical receiver (121) and the optical view axis (127) of the backside camera (105) is about 63 millimeters. In some embodiments, the separation distance (123) measured perpendicularly between the optical view axis (125) of the optical receiver (121) and the optical view axis (127) of the backside camera (105) is adjustable.
In some embodiments, the optical receiver (305), the optical conveyance device (307), and the optical output (309) are integrated within a case (300) for the electronic device (100), such that when the electronic device (100) is inserted into the case (300) the optical output (309) covers the frontside camera (103) of the electronic device (100) and an optical view axis (313) of the optical receiver (305) is substantially parallel to and commonly oriented with an optical view axis (315) of the backside camera (105) of the electronic device (100). In some embodiments, with the optical receiver (305), the optical conveyance device (307), and the optical output (309) integrated within the case (300) for the electronic device (100), a separation distance (317) measured perpendicularly between the optical view axis (313) of the optical receiver (305) and the optical view axis (315) of the backside camera (105) is about 63 millimeters when the electronic device (100) is inserted into the case (300). In some embodiments, with the optical receiver (305), the optical conveyance device (307), and the optical output (309) integrated within the case (300) for the electronic device (100), the separation distance (317) measured perpendicularly between the optical view axis (313) of the optical receiver (305) and the optical view axis (315) of the backside camera (105) is adjustable. In some embodiments, the case (300) with the electronic device (100) inserted into the case (300) is installed within a head-mounted display device (110). In these embodiments, the head-mounted display device (110) is configured to optically transform images displayed on a display screen (101) of the electronic device (100) into immersive visual content.
In some embodiments, the optical receiver (405, 505), the optical conveyance device (407, 507), and the optical output (409, 509) are integrated within a clip (401, 501) configured to attach to the electronic device (100). The clip (401, 501) is configured such that when the clip (401, 501) is attached to the electronic device (100) the optical output (409, 509) covers the frontside camera (103) of the electronic device (100) and an optical view axis (413, 513) of the optical receiver (405, 505) is substantially parallel to and commonly oriented with an optical view axis (415, 515) of the backside camera (105) of the electronic device (100). In some embodiments, a separation distance (417, 517) measured perpendicularly between the optical view axis (413, 513) of the optical receiver (405, 505) and the optical view axis (415, 515) of the backside camera (105) is about 63 millimeters when the clip (401, 501) is attached to the electronic device (100). In some embodiments, the separation distance (417, 517) measured perpendicularly between the optical view axis (413, 513) of the optical receiver (405, 505) and the optical view axis (415, 515) of the backside camera (105) is adjustable. In some embodiments, the clip (501) is configured to fit over a case (300) for the electronic device (100). In some embodiments, the electronic device (100) with the clip (401, 501) attached to the electronic device (100) is installed within a head-mounted display device (110). The head-mounted display device (110) is configured to optically transform images displayed on a display screen (101) of the electronic device (100) into immersive visual content.
In some embodiments of the method of
In some embodiments, the method of
In some embodiments, the method of
In some embodiments, the electronic device 100 can use its native wireless communication circuitry to communicate with a network 703, such as the Internet. The electronic device 100 can communicate with various content sites including cloud gaming content 705, cloud entertainment content 707, social media content 709, or any other type of content that is accessible over the Internet or private networks. In some embodiments, the content being accessed by the electronic device 100 can be downloaded and executed by the electronic device 100 once the content has been received. In some embodiments, the content can be streamed from the Internet source and delivered to the electronic device 100 when needed. In some embodiments, the content is streamed from a service provider that provides games, entertainment content, or other multimedia for consumption by the electronic device 100 when the electronic device 100 is used with the HMD 110.
The HMD 110B can include optical components positioned between the user's eyes and the display screen 101 of the electronic device 100. The optical components can be configured to provide viewing of the content shown on the display screen 101 of the electronic device 100, and optimize the rendering, sizing, re-sizing, sharpness, prescription, and/or other distortion or non-distortion adjustments of the content shown on the display screen 101.
In some embodiments, activation of a see-through mode (i.e., transparent or semi-transparent mode) is used when the user 701 is interacting with a virtual scene displayed by the electronic device 100 within the HMD 110B. The see-through mode may be activated when the user 701 wishes to disconnect from the virtual scene to interact with another person, take a phone call, pause a game or session, or conduct an interactive session or communication. The see-through mode can also be automatically triggered, such as when signals are received or when safety alerts or notifications are needed.
In some embodiments, the visual content shown on the display screen 101 of the electronic device 100 defines a virtual reality scene, such as the example virtual reality scene 901. In some embodiments, both the backside camera 105 of the electronic device 100 and the optical receiver (121, 305, 405, 505) can be operated to monitor a forward view of the real world from the HMD 110 to assist the user 701 in interacting with the virtual reality scene, such as by allowing the user 701 to place an image of their hand 903 into the virtual reality scene 901 to interact with the content in the virtual reality scene 901. The stereoscopic vision provided by the combination of the backside camera 105 of the electronic device 100 and the optical receiver (121, 305, 405, 505) enables depth perception within the real world environment around the user 701, which in turn enables more accurate determination of how physical actions of the user 701 are intended to correspond to virtual interactions of the user within the virtual reality scene 901. For example, the user 701 can position their hand 903 within the real world into the fields of view of the backside camera 105 of the electronic device 100 and the optical receiver (121, 305, 405, 505), and this positioning of the user's hand is detected by software operating on the electronic device 100 to allow blending of a virtual image of the user's hand into the virtual reality scene 901. In various embodiments, the user's hand can be shown in the virtual reality scene 901 as an image of the user's actual hand or as a computer generated image of the user's hand, or as augmented reality, or as a blend of augmented and real-world images. It should be understood that the example discussed above regarding use of the backside camera 105 of the electronic device 100 and the optical receiver (121, 305, 405, 505) to provide stereoscopic vision for detecting the user's hand in the real world and translating that into a position of the user's hand in the virtual reality scene 901 is one of an essentially unlimited number of ways in which the stereoscopic vision as afforded by the present invention can be utilized.
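As an illustration of the hand-detection step described above, the sketch below segments the nearest region of a disparity map, on the assumption that the user's hand is typically the closest surface in front of the HMD; the resulting mask could then drive blending of the camera image into the rendered scene. The margin value and function name are illustrative assumptions.

```python
import numpy as np

def nearest_region_mask(disparity, margin=5.0):
    """Mask the closest surface in a disparity map (e.g., the user's hand).

    disparity: float32 map where larger values are closer; margin is an
    assumed tolerance (disparity units) around the nearest surface.
    """
    valid = disparity > 0  # ignore pixels with no stereo match
    if not np.any(valid):
        return np.zeros(disparity.shape, dtype=bool)
    nearest = disparity[valid].max()  # closest matched surface
    return valid & (disparity >= nearest - margin)
```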
In some embodiments, the backside camera 105 and the combination of the optical conveyance system (i.e., optical receiver (121, 305, 405, 505), optical conveyance device (131, 307, 407, 507), and optical output (129, 309, 409, 509)) and frontside camera 103 provides a stereoscopic view of the real world when the user desires to view the real world and/or exit the virtual reality scene. Also, in some embodiments, the backside camera 105 and the combination of the optical conveyance system and frontside camera 103 provides a stereoscopic view of the real world that can be used to provide safety notifications to the user 701 while the user 701 is interacting with the virtual reality scene 901. For example, if the user 701 walks around a particular real-world space while interacting with the virtual reality scene 901 and is dangerously approaching stairs, a wall, or some other object, the stereoscopic vision can be used to detect those objects/dangers and enable the electronic device 100 to provide a notification to the user 701 within the HMD 110, such as by providing a message, a notification, an alarm, a sound, tactile feedback, or the like, to the user 701.
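A proximity check of this kind can be reduced to recovering per-pixel depth from the disparity map and comparing it against a warning distance. In the sketch below, the baseline reflects the approximately 63 millimeter axis separation noted in this disclosure, while the focal length, warning distance, and minimum region size are illustrative assumptions.

```python
import numpy as np

FOCAL_PX = 700.0    # assumed focal length in pixels
BASELINE_M = 0.063  # ~63 mm separation between optical view axes
WARN_DIST_M = 0.75  # assumed warning distance in meters
MIN_PIXELS = 500    # assumed minimum region size, to avoid noise alerts

def obstacle_warning(disparity):
    """Return True if a sizable surface lies closer than WARN_DIST_M.

    Per-pixel depth is recovered as Z = f * B / d from the disparity map.
    """
    valid = disparity > 0
    depth = np.full(disparity.shape, np.inf)
    depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
    return int(np.count_nonzero(depth < WARN_DIST_M)) >= MIN_PIXELS
```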
The backside camera 105 and the combination of the optical conveyance system and frontside camera 103 is useful for providing safety notifications about the real world space around the user 701 when the user 701 becomes immersed in the virtual reality scene 901. The backside camera 105 and the combination of the optical conveyance system and frontside camera 103 can also be used to provide the user 701 with transitions out of the virtual reality scene into the real world space by enabling partially transparent views, fully transparent views, blends of fully and partially transparent views, or partial views that may show actual features that are of interest to the user 701 or may be dangerous when the user 701 is wearing the HMD 110. Additionally, the backside camera 105 and the combination of the optical conveyance system and frontside camera 103 can be used to enable more accurate tracking of movement of the HMD 110.
Embodiments of the present invention may be practiced with various computer system configurations including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers and the like. The invention can also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a wire-based or wireless network.
With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, or the apparatus can be a general-purpose computer selectively activated or configured by a computer program stored in the computer. In particular, various general-purpose machines can be used with computer programs written in accordance with the teachings herein, or it may be more convenient to construct a more specialized apparatus to perform the required operations.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the described embodiments.