A virtual image system is disclosed comprising: a passive optical unit for interfacing with portions of a user's head; and an optical projection unit directing a retinal scanning beam into the passive optical unit, the optical projection unit detached from the passive optical unit.
What is claimed is:
1. A virtual image system comprising:
a passive optical unit for interfacing with portions of a user's head;
a plurality of optical projection units directing retinal scanning beams into said passive optical unit, wherein said optical projection units are detached from said passive optical unit; and
wherein said optical projection units are disposed across an area such that when said user moves out of range of one of said optical projection units, said user is within the range of another of said optical projection units.
13. A virtual image system comprising:
a tracking unit generating positional data responsive to a position of a passive optical unit, said passive optical unit adapted to fit around portions of a user's head;
a control unit generating control signals responsive to said positional data; and
one or more optical projection units directing retinal scanning beams into lenses of said passive optical unit responsive to said control signals, said optical projection units disposed across an area such that when said user moves out of range of one of said optical projection units, said user is within the range of another of said optical projection units.
Claims 2-12 and 15-26 (dependent claims; full text truncated in the source).
1. Field of the Invention
This invention relates generally to the field of virtual display systems. More particularly, the invention relates to a system and method for scanning a virtual image on a retina.
2. Description of the Related Art
Traditional display systems such as television and computer monitors produce "real" images. This means that the light waves producing the image actually emanate from the image point. Thus, as illustrated in FIG. 1, the light waves forming the image on a display such as a CRT monitor 120 emanate directly from the surface of the monitor's screen.
There are several problems with real image display systems. For example, these systems are typically large and cumbersome and require substantial amounts of power to operate (e.g., to illuminate the individual pixels on the CRT). In addition, these systems do not provide an adequate level of privacy. Anyone within the visual range of a CRT monitor 120 is able to view its contents, making these systems ill-suited for viewing confidential material in public places.
To solve some of these problems "virtual" display systems were developed. In contrast to a "real" display system, a "virtual" display system is one in which the light producing the image does not actually emanate from the image--it only appears to.
One type of virtual display system scans photons which contain image data directly onto the retina of a user's eye. As illustrated in FIG. 2, the optical and electronic components which generate and scan the beam are mounted directly on the user's head. Although the virtual display system illustrated in FIG. 2 solves some of the problems associated with real image displays (e.g., it consumes far less power and offers improved privacy), it introduces problems of its own.
One obvious problem with this configuration is that it forces a user to wear a heavy, bulky apparatus which may dissipate a substantial amount of heat. While such a system may be acceptable for specialized purposes (e.g., an army helicopter helmet or medical surgery glasses), it is unlikely that mainstream users would be willing to wear such a device on a regular basis.
Accordingly, what is needed is a virtual display system and method which solves the foregoing problems.
A virtual image system is disclosed comprising: a passive optical unit for interfacing with portions of a user's head; and an optical projection unit directing a retinal scanning beam into the passive optical unit, the optical projection unit detached from the passive optical unit.
All figures described below are two-dimensional for the sake of illustration. A better understanding of the present invention can be obtained from the following detailed description in conjunction with the drawings.
In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced without some of these specific details. In other instances, well-known structures and devices are shown in block diagram form to avoid obscuring the underlying principles of the invention.
In the embodiment of the system illustrated in FIG. 3, an optical projection unit 310, detached from the user, directs a retinal scanning beam 340 into a passive optical unit (e.g., a pair of glasses 330) worn by the user; the passive optical unit redirects the beam 340 onto the user's retinas.
Although the embodiment illustrated in FIG. 3 employs a pair of glasses 330, other passive optical units (e.g., contact lenses) may be used while still complying with the underlying principles of the invention.
In addition, the glasses or contact lenses 330 worn by the user may be holographic glasses/lenses. As is known in the art, a holographic lens can be configured so that each point within the lens will refract the incoming scanning beam at specified angles. Thus, holographic lenses provide a relatively inexpensive mechanism for directing the scanning beam into the user's retinas in a precise manner.
In FIG. 4, a tracking unit 410 generates a positional signal 420 responsive to the position and/or angle of the glasses 330. A control unit 430 receives the positional signal 420 and, in response, transmits a control signal 440 directing the optical projection unit 310 to adjust the scanning beam 340.
Control unit 430 may be comprised of hardware or any hardware/software combination. In one embodiment, control unit 430 is a computer system 500 such as the one illustrated in FIG. 5. Computer system 500 comprises a system bus 520 for communicating information, and a processor 510 coupled to bus 520 for processing information. Computer system 500 may further comprise a random access memory ("RAM") or other dynamic storage device 525 (referred to herein as main memory), coupled to bus 520 for storing information and instructions to be executed by processor 510. Main memory 525 also may be used for storing temporary variables or other intermediate information during execution of instructions by processor 510. Computer system 500 also may include a read only memory (ROM) and/or other static storage device 526 coupled to bus 520 for storing static information and instructions used by processor 510.
A data storage device 527 such as a magnetic disk or optical disc and its corresponding drive may also be coupled to computer system 500 for storing information and instructions. Computer system 500 can also be coupled to an I/O bus 550 via an I/O interface 530. A plurality of I/O devices may be coupled to I/O bus 550, including, for example, a display device 543 and an input device (e.g., an alphanumeric input device 542 and/or a cursor control device 541). In one embodiment of the system, a communication device 540 receives the positional signal 420 and transmits the control signal 440 in response.
Tracking unit 410 tracks the position and angle of the pair of glasses 330 or contact lenses using various tracking techniques. For example, in one embodiment of the system, tracking unit 410 is a camera which (in combination with control unit 430) implements robotic vision techniques similar to those used to identify parts on manufacturing assembly lines. In an embodiment in which the user wears contact lenses rather than glasses, the tracking unit may track the user's pupils and/or other features of the user's face.
In the embodiment illustrated in FIG. 7, three positional points 710-712 are disposed at known locations on the pair of glasses 330. Tracking unit 410 tracks each of these points, and control unit 430, which is programmed with the spatial relationship between the points, calculates the position and angle of the glasses from their tracked coordinates.
In an embodiment where contact lenses are used instead of glasses, these three positional points 710-712 may actually be portions of a user's face. For example, tracking unit 410 in this embodiment may track the position of the user's two pupils and the tip of the user's nose. Control unit 430 in this embodiment may be pre-programmed with the spatial relationship between these three points. Alternatively, a reflective substance may be painted/stamped on the user's face and/or forehead to produce the target points. Ideally, this substance would be one which reflects a particular type of signal/waveform not visible to the human eye (e.g., an infrared signal).
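For illustration only, the following minimal Python sketch (the coordinate conventions and function names are assumptions, not taken from the disclosure) shows one way a control unit might recover the position and facing direction of the glasses or face from three tracked 3D points such as points 710-712:

```python
import numpy as np

def glasses_pose(p0, p1, p2):
    """Estimate position and orientation from three tracked points.

    Returns the centroid of the three points and the unit normal of
    the plane they define; the normal indicates the direction the
    user is facing. (Illustrative sketch only.)"""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in (p0, p1, p2))
    centroid = (p0 + p1 + p2) / 3.0
    # The cross product of two edge vectors is perpendicular to the
    # plane containing all three points.
    normal = np.cross(p1 - p0, p2 - p0)
    normal /= np.linalg.norm(normal)
    return centroid, normal

# Example: three points roughly in the x-y plane, facing along +z.
center, facing = glasses_pose([0, 0, 0], [1, 0, 0], [0.5, 1, 0.05])
print(center, facing)
```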
In one embodiment of the system, optical projection unit 310 (in response to control signal 440) modifies the scanned beam 340 differently based on the particular motion of the user's face and/or pair of glasses. For example, as illustrated in FIG. 8, when the user translates laterally relative to the projection unit 310, the projection unit may compensate simply by redirecting the beam 340 to follow the new position of the glasses.
When the user's face and/or pair of glasses is rotated within the plane perpendicular to the direction of the projected beam 340, however, (as indicated by arrow 830) compensation by the projection unit 310 is slightly more complex. In one embodiment of the system, the raster scanning unit within the projection unit 310 includes an adjustable sweep circuit which modifies the horizontal and vertical deflection voltages of the scanning unit. This causes the image projected on the user's retina to rotate along with the user's head and/or glasses. In another embodiment, the scanning beam may be rotated via the direct, physical movement of one or more lenses and/or mirrors within the projection unit. Regardless of which techniques are used to compensate for the user's motion, however, the underlying principles of the invention remain the same.
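As a sketch of this roll compensation (the per-sample software interface is an assumption for illustration; an actual scanning unit would adjust its deflection voltages in hardware, as described above), rotating each raster deflection sample about the beam axis achieves the same effect:

```python
import math

def rotate_scan_point(x, y, roll_radians):
    """Rotate one raster-scan deflection sample (x, y) about the beam
    axis so the projected image rolls with the user's head. This is
    the software analogue of modifying the horizontal and vertical
    deflection voltages of the scanning unit."""
    c, s = math.cos(roll_radians), math.sin(roll_radians)
    return (c * x - s * y, s * x + c * y)

# A 10-degree head roll applied to the top-right corner of the raster.
print(rotate_scan_point(1.0, 1.0, math.radians(10)))
```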
A third type of motion for which compensation is required is a motion which changes the angle of incidence between the scanning beam 340 and the user's face and/or glasses. One example of this type of motion is illustrated by arrows 820-821 in FIG. 8.
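The disclosure does not specify the compensation math for oblique incidence, but to first order it can be modeled as pre-foreshortening the raster along the tilt axis; the sketch below is a hypothetical first-order model under that assumption:

```python
import math

def precompensate_for_tilt(x, y, tilt_radians):
    """Pre-foreshorten the raster along the tilt axis: a beam striking
    the lens at incidence angle `tilt_radians` is stretched by
    1/cos(tilt) in that direction, so scaling the deflection by
    cos(tilt) keeps the perceived image proportions constant.
    (First-order model; ignores perspective terms.)"""
    return (x * math.cos(tilt_radians), y)

print(precompensate_for_tilt(1.0, 1.0, math.radians(30)))
```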
To further improve system privacy, a bar code may be stamped on the user's glasses. This code will then be read by tracking unit 410 and verified (e.g., matched with the user's ID number) before control unit 430 will allow the projection unit 310 to generate the scanning beam 340. Alternatively, or in addition, the projection unit 310 (or other retinal identification apparatus) may initially scan and retrieve a "snapshot" of the user's retina. This retinal image will uniquely identify the user as authorized or unauthorized to view the scanned image. Any of the foregoing identification techniques will prevent unauthorized users from viewing potentially confidential information.
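A minimal sketch of this authorization gate, assuming a hypothetical table mapping bar codes to authorized user IDs (the codes and IDs below are invented for illustration):

```python
# Hypothetical mapping of glasses bar codes to authorized user IDs;
# the control unit enables the beam only for codes found here.
AUTHORIZED = {"GLASSES-0042": "user-17", "GLASSES-0099": "user-23"}

def beam_enabled(barcode: str) -> bool:
    """Return True only if the bar code read by the tracking unit
    matches an authorized user ID."""
    return barcode in AUTHORIZED

print(beam_enabled("GLASSES-0042"))  # True
print(beam_enabled("GLASSES-1234"))  # False
```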
In one embodiment, the lenses of the glasses 330 are configured like bifocals, with one portion of each lens dedicated to receiving the scanning beam 340 and the remaining portion available for normal viewing.
The same type of bifocal technology as that described above may be implemented with contact lenses. In addition, if contact lenses are used, they may be appropriately weighted so that they are pulled into position by gravity once inserted.
In another embodiment, illustrated in FIG. 9, one or more optical projection units 910, 911 are positioned in a public area and scan a virtual information plane 930 into the retinas of a plurality of users, each of whom needs only a passive pair of glasses or contact lenses.
Furthermore, because the virtual image is projected directly into the users' retinas, the virtual information plane 930 can be viewed anywhere--even in an outdoor area on a bright, sunny day. By contrast, monitors using current CRT technology are incapable of producing enough light to be visible under these conditions.
The number of optical projection units 910, 911 necessary for implementing this embodiment of the invention will depend on the maximum scanning frequency of each of the optical projection units 910, 911--i.e., the number of individual retinas that a single unit can scan at some minimum reasonable scanning rate. What is "reasonable" will depend on the minimum number of frames/second (i.e., the number of times that a single retina is completely scanned in a second) which will allow a human to comfortably view the scanned image (i.e., without a significant "flicker"). This minimum frame rate typically increases with the brightness of the projected or scanned image. For example, a bright computer screen may require a higher scanning rate (e.g., 75 Hz or greater) than an image in a relatively dim movie theater (e.g., 48 Hz).
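To make the sizing arithmetic concrete, here is a hypothetical back-of-the-envelope sketch (the sweep-rate figure is invented for illustration; only the 75 Hz and 48 Hz frame rates come from the text above):

```python
import math

def units_required(total_retinas, sweeps_per_second, min_frame_rate_hz):
    """How many projection units are needed to serve every retina.

    sweeps_per_second: complete raster scans one unit can perform per
        second (its maximum scanning frequency).
    min_frame_rate_hz: minimum full scans per second per retina for
        flicker-free viewing (e.g., 75 for a bright image, 48 for a
        dim movie-theater image).
    Assumes a unit can scan at least one retina at the minimum rate."""
    retinas_per_unit = sweeps_per_second // min_frame_rate_hz
    return math.ceil(total_retinas / retinas_per_unit)

# 200 viewers (400 retinas) at movie-theater brightness, with units
# hypothetically capable of 2400 sweeps per second:
print(units_required(400, 2400, 48))  # 50 retinas per unit -> 8 units
```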
In one embodiment of the virtual display system, the user's headset includes a microphone and/or earpiece for transmitting/receiving an audio signal from the public optical projection unit 910, thereby expanding the range of potential applications for the virtual display system. For example, this embodiment would allow the user to participate in video conferences.
The virtual display system described herein may be used for a variety of additional applications. For example, the system can replace big-screen television sets or other monitors which take up a substantial amount of floor space. The system can also replace large video projection systems (e.g., those used in movie theaters or large conferences), and thereby eliminate the need for large, flat projection surfaces.
In one embodiment of the system, an entire room may be filled with tracking units 410 and/or optical projection units 310 to produce a virtual reality environment. The optical projection units 310 may be spaced sufficiently close together such that as soon as the user moves out of range of one projection unit 310, he or she falls within the range of a second projection unit 310. One or more tracking units 410 continually track the direction in which the user is looking at any given time and transmit this positional data to a central control unit 430 (or several control units which communicate with one another).
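A minimal sketch of the hand-off logic, assuming for illustration that each projection unit 310 has a known position and a circular coverage area (neither detail is specified in the disclosure):

```python
def active_projector(user_pos, projectors, max_range):
    """Select the projection unit responsible for a user: the nearest
    unit whose range covers the user's current position. Units spaced
    closer together than max_range guarantee that a hand-off candidate
    always exists as the user moves."""
    def dist2(p):
        return sum((a - b) ** 2 for a, b in zip(user_pos, p))
    in_range = [p for p in projectors if dist2(p) <= max_range ** 2]
    return min(in_range, key=dist2) if in_range else None

units = [(0, 0), (4, 0), (8, 0)]  # hypothetical projector positions
print(active_projector((3, 1), units, max_range=3.0))  # -> (4, 0)
```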
In addition, numerous different users can interact within the virtual reality environment. For example, in one embodiment, a motion capture system such as those used by the motion picture industry and the video game industry tracks the motion of each user and transmits this information back (in the form of a video image) to the other users. In one embodiment, this is accomplished by affixing magnetic sensors to specific parts of each user's body and then generating a magnetic field within the virtual space which is distorted by the sensors. In another embodiment, reflective elements are placed on the user and optical sensors are positioned throughout the virtual space.
Regardless of which type of motion capture technology is used, the resulting motion data is transmitted to the central control unit 430 (or other control/computer system) which, in response, generates (via optical projection unit 310) an image representing each of the users. For example, in a "virtual nightclub," individuals would be able to select from a variety of different nightclub characters before entering the virtual nightclub environment. The control unit 430 would then generate the selected character and project the character's image through optical projection unit 310 to all of the users in the virtual reality environment.
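As an illustrative sketch of this dataflow (all types and names are hypothetical, and the render/scan stubs stand in for the real imaging pipeline):

```python
from dataclasses import dataclass
from typing import Dict, Tuple

@dataclass
class MotionFrame:
    user_id: str
    # joint name -> (x, y, z), from magnetic or optical sensors
    joints: Dict[str, Tuple[float, float, float]]

def broadcast_character(frame, characters, projectors, render):
    """Map one user's captured motion onto their chosen character and
    send the rendered image to every *other* user's projection unit."""
    image = render(characters[frame.user_id], frame.joints)
    for user_id, scan in projectors.items():
        if user_id != frame.user_id:
            scan(image)

# Stubs for demonstration: render returns a string, scan prints it.
render = lambda character, joints: f"<{character} posed with {len(joints)} joints>"
projectors = {"alice": print, "bob": print}
frame = MotionFrame("alice", {"head": (0, 1.7, 0), "hand": (0.4, 1.2, 0.1)})
broadcast_character(frame, {"alice": "vampire", "bob": "disco-dancer"}, projectors, render)
```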
One important benefit of the foregoing system over prior art systems is that users in the present system are only required to use a lightweight, inexpensive pair of glasses or contact lenses. By contrast, users of prior art virtual reality systems are required to wear heavy, bulky, typically more expensive headsets or helmets due to the electrical and optical components included therein. As such, using the present system and method, an owner of a virtual reality service need not worry about users damaging or stealing sensitive, expensive equipment. Accordingly, the present system also provides the benefit of reduced operating costs.
Finally, in one embodiment, the virtual reality system is communicatively coupled to other virtual reality systems over a network (e.g., the Internet). In this manner, users from across the globe can participate in a single virtual reality event. This may be accomplished, for example, by providing the control unit 430 with a high speed network card for connecting the control unit 430 to a high speed local area network and/or a high speed Internet connection such as a fractional T-1 or DSL channel.
Throughout the foregoing description, for the purposes of explanation, numerous specific details were set forth in order to provide a thorough understanding of the present system and method. It will be apparent, however, to one skilled in the art that the system and method may be practiced without some of these specific details. For example, the embodiments of the system and method illustrated in the figures are described by way of example only; accordingly, the scope and spirit of the invention should be judged in terms of the claims which follow.