A display waveguide configured for conveying polychromatic image light to a viewer includes a substrate and a high-index layer supported by the substrate. The high-index layer supports the transmission of the longer-wavelength color channel of the image light in at least a portion of the field of view.
1. A waveguide for conveying image light to an eyebox, the waveguide comprising:
a substrate of a first refractive index n1;
a high-index layer of a second refractive index n2 supported by the substrate without an air gap therebetween, wherein n2 is greater than n1;
an input coupler at a first location along the waveguide; and
an output coupler at a second, different location along the waveguide;
wherein the input coupler is configured to couple a first portion of the image light into the waveguide for propagating toward the output coupler partly in the substrate, and to couple a second portion of the image light into the waveguide for propagating toward the output coupler within the high-index layer; and
wherein the output coupler is configured to couple the image light out of the waveguide toward the eyebox, the output coupler comprising a first grating and a second grating, wherein at least one of the first or second gratings is configured to redirect light propagating in the high-index layer.
10. A waveguide for conveying image light to an eyebox, the waveguide comprising:
a substrate of a first refractive index n1;
a high-index layer of a second refractive index n2 supported by the substrate without an air gap therebetween, wherein n2 is greater than n1;
an input coupler configured to couple the image light into the high-index layer; and
an output coupler configured to couple the image light out of the waveguide toward the eyebox, the output coupler comprising a first grating and a second grating, wherein at least one of the first or second gratings is configured to redirect light propagating in the high-index layer;
wherein the image light comprises a first color channel and a second color channel, wherein the second color channel comprises longer wavelengths than the first color channel, and wherein the input coupler is configured to propagate at least a portion of the second color channel in the high-index layer by total internal reflection (TIR) at an interface between the high-index layer and the substrate.
2. The waveguide of
3. The waveguide of
4. The waveguide of
5. The waveguide of
6. The waveguide of
11. The waveguide of
12. The waveguide of
13. The waveguide of
14. The waveguide of
15. The waveguide of
16. The waveguide of
17. The waveguide of
18. The waveguide of
19. The waveguide of
20. The waveguide of
This application claims priority to U.S. Provisional Application No. 62/926,053 entitled “Display Waveguide with a High-Index Layer” filed on Oct. 25, 2019 and incorporated herein by reference in its entirety.
The present disclosure generally relates to optical display systems and devices, and in particular to waveguide displays and components therefor.
Head mounted displays (HMD), helmet mounted displays, near-eye displays (NED), and the like are being used increasingly for displaying virtual reality (VR) content, augmented reality (AR) content, mixed reality (MR) content, etc. Such displays are finding applications in diverse fields including entertainment, education, training, and biomedical science, to name just a few examples. The displayed VR/AR/MR content can be three-dimensional (3D) to enhance the experience and to match virtual objects to real objects observed by the user. Eye position, gaze direction, and/or head orientation of the user may be tracked in real time, and the displayed imagery may be dynamically adjusted depending on the user's head orientation and gaze direction, to provide a better experience of immersion into a simulated or augmented environment.
Compact display devices are desired for head-mounted displays. Because a display of HMD or NED is usually worn on the head of a user, a large, bulky, unbalanced, and/or heavy display device would be cumbersome and may be uncomfortable for the user to wear.
Projector-based displays provide images in angular domain, which can be observed by a user's eye directly, without an intermediate screen or a display panel. An imaging waveguide may be used to carry the image in angular domain to the user's eye. The lack of a screen or a display panel in a projector display enables size and weight reduction of the display.
Embodiments disclosed herein will be described in greater detail with reference to the accompanying drawings which represent example embodiments thereof, in which like elements are indicated with like reference numerals, and wherein:
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular optical and electronic circuits, optical and electronic components, techniques, etc. in order to provide a thorough understanding of the present invention. However, it will be apparent to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known methods, devices, and circuits are omitted so as not to obscure the description of the example embodiments. All statements herein reciting principles, aspects, and embodiments, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Note that as used herein, the terms “first”, “second”, and so forth are not intended to imply sequential ordering, but rather are intended to distinguish one element from another, unless explicitly stated. Similarly, sequential ordering of method or process steps does not imply a sequential order of their execution, unless explicitly stated.
Furthermore, the following abbreviations and acronyms may be used in the present document: HMD (Head Mounted Display); NED (Near Eye Display); VR (Virtual Reality); AR (Augmented Reality); MR (Mixed Reality); LED (Light Emitting Diode); FOV (Field of View); TIR (Total Internal Reflection); HI (High Index). The terms “NED” and “HMD” may be used herein interchangeably.
Example embodiments may be described hereinbelow with reference to polychromatic light that is comprised of three distinct color channels. The color channel with the shortest wavelengths may be referred to as the blue (B) channel or color, and may represent the blue channel of an RGB color scheme. The color channel with the longest wavelengths may be referred to as the red (R) channel or color, and may represent the red channel of the RGB color scheme. The color channel with wavelengths between the red and blue color channels may be referred to as the green (G) channel or color, and may represent the green channel of the RGB color scheme. The blue light or color channel may correspond to wavelengths of about 500 nm or shorter, the red light or color channel may correspond to wavelengths of about 625 nm or longer, and the green light or color channel may correspond to a wavelength range of 500 nm to 565 nm. It will be appreciated, however, that the embodiments described herein may be adapted for use with polychromatic light comprised of any combination of two or more, or preferably three or more, color channels, which may represent different portions of a relevant optical spectrum.
An aspect of the present disclosure relates to a display system comprising a waveguide and an image light source coupled thereto, wherein the waveguide is configured to receive image light emitted by the image light source and to convey the image light received in a field of view (FOV) of the waveguide to an eyebox for presenting to a user. The term “field of view” (FOV), when used in relation to a display system, may relate to an angular range of light propagation supported by the system or visible to the user. A two-dimensional (2D) FOV may be defined by angular ranges in two orthogonal planes. For example, a 2D FOV of a NED device may be defined by two one-dimensional (1D) FOVs, which may be a vertical FOV, for example +/−20° relative to a horizontal plane, and a horizontal FOV, for example +/−30° relative to the vertical plane. With respect to a FOV of a NED, the “vertical” and “horizontal” planes or directions may be defined relative to the head of a standing person wearing the NED. Otherwise the terms “vertical” and “horizontal” may be used in the present disclosure with reference to two orthogonal planes of an optical system or device being described, without implying any particular relationship to the environment in which the optical system or device is used, or any particular orientation thereof to the environment.
An aspect of the present disclosure relates to a waveguide for conveying image light to an eyebox, the waveguide comprising a substrate of a first refractive index n1, and a high-index layer of a second refractive index n2 supported by the substrate without an air gap therebetween, wherein n2 is greater than n1. The waveguide may further comprise an input coupler configured to couple the image light into the high-index layer, and an output coupler configured to couple the image light out of the waveguide toward the eyebox. The output coupler may comprise a first grating and a second grating, at least one of which is configured to redirect light propagating in the high-index layer.
In some implementations the input coupler may comprise an input grating disposed to couple at least a portion of the image light into the high-index layer. In some implementations each one of the first and the second gratings may be disposed in the high-index layer or at a surface thereof to diffract light propagating in the high-index layer. In some implementations the first grating may be disposed at a surface of the high-index layer, and the second grating may be disposed at an interface between the substrate and the high-index layer. In some implementations the output coupler may comprise a third grating configured to redirect light propagating in the substrate. In some implementations the third grating may be disposed at a surface of the substrate opposite of the high-index layer.
In some implementations, the second refractive index is at least 2.2. In some implementations, an index difference (n2−n1) may be equal to or greater than 0.3. In some implementations, an index difference (n2−n1) may be at least 0.4. In some implementations, the high-index layer may be at least 150 microns thick.
In some implementations, the waveguide may be configured for the image light comprising a first color channel and a second color channel, wherein the second color channel comprises longer wavelengths than the first color channel, and the input coupler may be configured to trap at least a portion of the second color channel in the high-index layer by means of total internal reflection (TIR) at an interface between the high-index layer and the substrate.
In some implementations, the waveguide is configured to provide a field of view (FOV) shared by the first and second color channels, the FOV having an angular FOV width Θ in at least one direction, and wherein the high-index layer supports at least 50% of the angular FOV width Θ of at least one of the first and second color channels.
In some implementations, the first color channel comprises one of green or blue light and the second color channel comprises red light, and the input coupler may be configured to trap at least a portion of the red light in the high-index layer by means of TIR at the interface between the high-index layer and the substrate. In some implementations the first color channel may comprise blue light and the second color channel comprises green light, and the input coupler may be configured to trap at least a portion of the green light in the high-index layer by means of TIR at the interface between the high-index layer and the substrate. In some implementations the input coupler may be configured to direct at least a portion of the first color channel into the substrate through the interface of the high-index layer with the substrate.
In some implementations, the image light may further comprise a third color channel, and the input coupler may be configured to couple all three color channels into the waveguide for propagating toward the output coupler.
In some implementations, the first grating and the second grating may cooperate for diffracting the image light trapped in the high-index layer out of the waveguide at an output angle equal to an angle of incidence thereof upon the waveguide.
In some implementations, the waveguide may comprise a third grating disposed in the substrate or at a surface thereof, and at least one of the first grating or the second grating may cooperate with the third grating for diffracting the image light propagating in the substrate out of the waveguide at an output angle equal to an angle of incidence thereof upon the waveguide.
An aspect of the present disclosure relates to a near-eye display (NED) device comprising: a support structure for wearing on a head of a user; a light projector carried by the support structure and configured to emit image light comprising a plurality of color channels; and, a first waveguide carried by the support structure and configured to convey at least a first color channel and a second color channel of the image light from the light projector to an eyebox. The first waveguide may comprise a substrate of a first refractive index n1 and a high-index layer of a second refractive index n2 supported by the substrate, wherein n2 is greater than n1. The waveguide may further comprise an input coupler configured to couple the image light into the first waveguide, and an output coupler configured to couple the image light out of the first waveguide toward the eyebox. The output coupler may comprise a first grating and a second grating, each configured to redirect light propagating in the high-index layer.
In some implementations of the NED device, the first waveguide may be further configured to convey a third color channel of the image light from the light projector to an eyebox. Some implementations of the NED device may include a second waveguide configured to convey a third color channel of the image light from the light projector to an eyebox.
An aspect of the present disclosure provides a waveguide for conveying image light in a display system, the waveguide comprising: a waveguide body having two outer surfaces and a thickness therebetween, the waveguide body comprising an input area and an output area, the waveguide body configured to guide the image light received at the input area toward the output area, wherein the waveguide body has a refractive index that varies in a direction of the thickness. An input coupler may be disposed in the input area and configured to couple the image light into the waveguide body for propagating toward the output area. An output coupler may be disposed in the output area and configured to couple the image light out of the waveguide body for propagating toward a viewing area.
In some implementations the refractive index may be greater at one of the outer surfaces than at the other of the outer surfaces. In some implementations the refractive index may be greater in a middle portion of the waveguide body between the outer surfaces than in portions of the waveguide body adjacent to the outer surfaces.
Example embodiments of the present disclosure will now be described with reference to a waveguide display. Generally a waveguide display may include an image light source such as a pixelated electronic display or a scanning projector assembly, a controller, and an optical waveguide configured to transmit image light from the image light source to an exit pupil for presenting images to a user. The image light source may also be referred to herein as a display projector, an image projector, or simply as a projector. Example display systems incorporating a display waveguide in which features and approaches disclosed herein may be used include, but are not limited to, a near-eye display (NED), a head-up display (HUD), a head-down display, and the like.
With reference to
The image light source 110, which may be referred to herein as projector 110, is configured to emit image light 111. In some embodiments the image light source 110 may be in the form of, or include, a scanning projector. In some embodiments the scanning projector may include a light source, such as but not limited to a laser diode (LD) or a light-emitting diode (LED), and one or more scanning reflectors. In some embodiments the scanning projector may include a scanning light source. In some embodiments the image light source 110 may include a pixelated micro-display, such as for example but not limited to a liquid crystal display (LCD), an organic light emitting display (OLED), an inorganic light emitting display (ILED), an active-matrix organic light-emitting diode (AMOLED) display, or a transparent organic light emitting diode (TOLED) display. In some embodiments the image light source 110 may include a linear array of light sources, such as LEDs, LDs, or the like. In some embodiments it may include a 2D pixel array, and each pixel may be configured to emit polychromatic light. The image light source 110 may further include one or more optical components configured to suitably condition the image light. This may include, without limitation, expanding, collimating, correcting for aberrations, and/or adjusting the direction of propagation of the image light, or any other suitable conditioning as may be desired for a particular system and electronic display. The one or more optical components in the optics block may include, without limitation, one or more lenses, mirrors, apertures, gratings, or a combination thereof. In some embodiments the optics block of the image light source 110 may include one or more adjustable elements operable to scan the beam of light with respect to its propagation angle.
The waveguide 120 may comprise a waveguide body 123, an input coupler 130 in an input area of the waveguide, and an output coupler 140 in an output area of the waveguide. In some embodiments a waveguide stack composed of two or more waveguides that are stacked one over another may be used in place of the waveguide 120. The input coupler 130 may be disposed at a location where it can receive the image light 111 from the image light source 110. The input coupler 130, which may also be referred to herein as the in-coupler 130, is configured to couple the image light 111 into the waveguide 120, where it propagates toward the output coupler 140. The output coupler 140, which may also be referred to herein as the out-coupler, may be offset from the input coupler 130 and configured to de-couple the image light from the waveguide 120 for propagating in a desired direction, such as for example toward a user's eye 166. The out-coupler 140 may be greater in size than the in-coupler 130, to expand the image beam as it leaves the waveguide and to support a larger exit pupil than that of the projector 110. In some embodiments the waveguide body 123 may be partially transparent to outside light, and may be used in AR applications. The waveguide 120 may be configured to convey a two-dimensional (2D) FOV from the input coupler 130 to the output coupler 140, and ultimately to the eye 166 of the user. Here and in the following description the display waveguide 120 and embodiments thereof may be described with reference to a Cartesian coordinate system (x,y,z), in which the (x,y) plane is parallel to the outer faces of the waveguide through which the waveguide receives and/or outputs the image light, and the z-axis is orthogonal thereto. In some embodiments the 2D FOV of waveguide 120 may be defined by a 1D FOV in the (y,z) plane and a 1D FOV in the (x,z) plane, which may also be referred to as the vertical and horizontal FOVs, respectively.
An in-coupler 230 may be provided in an input area 203 of the waveguide 210 and may be in the form of one or more diffraction gratings. An out-coupler 240, which may also be in the form of one or more diffraction gratings, may be disposed in an output area 209 of the waveguide, and may be laterally offset from the in-coupler 230, for example along the y-axis. In the illustrated embodiment the out-coupler 240 is located at the same face 211 of the waveguide 210 as the in-coupler 230, but in other embodiments it may be located at the opposite face 212 of the waveguide. Some embodiments may have two input gratings that may be disposed at main outer faces 211, 212 of the waveguide, and/or two output gratings that may be disposed at main outer faces 211, 212 of the waveguide, or superimposed at a same face or at a same plane within the waveguide. The gratings embodying couplers 230, 240 may be any suitable diffraction gratings, including volume and surface-relief gratings, such as for example blazed gratings. The gratings may also be volume holographic gratings. In some embodiments they may be formed in the material of the waveguide itself. In some embodiments they may be fabricated in a different material or materials that may be affixed to a face or faces of the waveguide at desired locations.
The in-coupler 230 may be configured to support an input FOV 234, which may also be referred to herein as the acceptance angle. The input FOV 234, which depends on wavelength, defines a range of angles of incidence α for which the light incident upon the in-coupler 230 is coupled into the waveguide and propagates toward the out-coupler 240. In the context of this specification, “coupled into the waveguide” means coupled into the guided modes of the waveguide or modes that have suitably low radiation loss. Light coupled into the waveguide that experiences total internal reflection (TIR) upon the waveguide's outer surfaces 211 and 212 may propagate within the waveguide with suitably low attenuation until it is redirected by an out-coupler. Thus waveguide 210 may trap light of a particular wavelength λ by means of TIR, and guide the trapped light toward the out-coupler 240, provided that the angle of incidence of the light upon the in-coupler 230 from the outside of the waveguide is within the input FOV 234 of the waveguide 210. The input FOV 234 of the waveguide is determined at least in part by a pitch p of the in-coupler grating 230 and by the refractive index n of the waveguide. For a given grating pitch p, the first-order diffraction angle β of the light incident upon the grating 230 from the air at an angle of incidence α in the (y, z) plane may be found from a diffraction equation (1):
n·sin(β)−sin(α)=λ/p. (1)
Here the angle of incidence α and the diffraction angle β are positive if the corresponding wavevectors have components directed toward the out-coupler 240. Diffraction angle β defines the angle of propagation of the diffracted ray of the image light in the waveguide, and may also be referred to herein as the propagation angle. Equation (1) may be easily modified for embodiments in which light enters the waveguide 210 from a material with refractive index nc>1. Equation (1) holds for rays of image light with a plane of incidence normal to the grooves of the in-coupler grating, i.e. when the plane of incidence of image light includes the grating vector of the in-coupler. In the illustrated example, the grating vector of the in-coupler may be directed along the y-axis.
Light experiencing TIR in a waveguide or a layer may be referred to herein as the in-coupled light or trapped light. The TIR condition for the diffracted light within the waveguide may be defined by equation (2):
n·sin(β)≥1, (2)
where the equality corresponds to a TIR angle βc=arcsin(1/n). The input FOV 234 of waveguide 210 spans between a first FOV angle of incidence α1 and a second FOV angle of incidence α2, which may be referred to herein as the FOV angles. The first FOV angle of incidence α1 corresponding to the right-most incident ray 111b in
The second FOV angle of incidence α2, corresponding to the left-most incident ray 111a in
The width w=|α1-α2| of the input 1D FOV of the waveguide 210 at a particular wavelength can be estimated from equations (3) and (4). Generally the input FOV of a waveguide increases as the refractive index of the waveguide increases relative to that of the surrounding media. By way of example, for a substrate of index n surrounded by air and for βmax=75°, λ/p=1.3, the width w of the input 1D FOV of the waveguide for monochromatic light may be about 26° for n=1.5, about 43° for n=1.8, and about 107° for n=2.4.
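The example widths above can be sketched numerically from equations (1) and (2) alone, taking α1 as the incidence angle whose diffracted ray propagates at βmax and α2 as the angle whose diffracted ray is at the TIR limit. The following is an illustrative Python sketch under the stated assumptions (air surround, βmax=75°, λ/p=1.3); it is not part of the disclosure.

```python
import math

def input_fov_width(n, beta_max_deg=75.0, lam_over_p=1.3):
    """Estimate the 1D input FOV width (degrees) of a slab waveguide in air.

    From the diffraction equation (1), n*sin(beta) - sin(alpha) = lam/p:
      alpha1: incidence angle diffracting to the maximum angle beta_max;
      alpha2: incidence angle diffracting to the TIR angle, n*sin(beta_c) = 1.
    """
    s1 = n * math.sin(math.radians(beta_max_deg)) - lam_over_p
    s1 = max(-1.0, min(1.0, s1))       # incidence from air cannot exceed grazing
    alpha1 = math.degrees(math.asin(s1))
    s2 = 1.0 - lam_over_p              # TIR limit of equation (2)
    alpha2 = math.degrees(math.asin(s2))
    return abs(alpha1 - alpha2)

for n in (1.5, 1.8, 2.4):
    print(f"n = {n}: input FOV width ~ {input_fov_width(n):.0f} deg")
```

The computed widths round to about 26°, 43°, and 107° for n = 1.5, 1.8, and 2.4, consistent with the example values above.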
As can be seen from equations (3) and (4), the input FOV 234 of waveguide 210 is a function of the wavelength λ of input light, so that the input FOV 234 shifts its position in the angle space as the wavelength changes; for example, it shifts towards the out-coupler 240 as the wavelength increases. Thus it can be challenging to provide a sufficiently wide FOV for polychromatic image light with a single waveguide.
Referring to
In some embodiments the gratings embodying the in-coupler 230 and the out-coupler 240 may be configured so that the vector sum of their grating vectors gi is equal to substantially zero:
|Σgi|=0 (5)
Here the summation in the left hand side (LHS) of equation (5) is performed over grating vectors gi of all gratings that cooperate to diffract the input light traversing the waveguide, including the one or more gratings of the in-coupler 230, and the one or more gratings of the out-coupler 240. A grating vector gi is a vector that is directed normally to the equal-phase planes of the grating, i.e. its “grooves”, and whose magnitude is inversely proportional to the grating pitch p, |gi|=2π/p. Under conditions of equation (5), rays of the image light exit the waveguide by means of the out-coupler 240 at the same angle at which they entered the in-coupler 230, provided that the waveguide 210 is an ideal slab waveguide with parallel outer faces 211, 212, and the FOV of the waveguide is defined by its input FOV. In practical implementations the equation (5) will hold with some accuracy, within an error threshold that may be allowed for a particular display system. In an example embodiment with a single 1D input grating and a 1D output grating, the grating pitch of the out-coupler 240 may be substantially equal to the grating pitch of the in-coupler 230.
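The closure condition of equation (5) can be checked numerically for a hypothetical symmetric three-grating layout. In the sketch below (illustrative only, not from the disclosure), the in-coupler vector g0 points along +y and the two output-grating vectors have equal pitch and are rotated ±120° from g0, a sign convention for which equal pitches make the three vectors sum to zero.

```python
import math

def grating_vector(pitch, angle_deg):
    """Grating vector with |g| = 2*pi/p, angle measured from the +y axis."""
    mag = 2.0 * math.pi / pitch
    a = math.radians(angle_deg)
    return (mag * math.sin(a), mag * math.cos(a))

p = 1.0  # arbitrary units; only the pitch ratios matter for closure
g0 = grating_vector(p, 0.0)      # in-coupler, along +y
g1 = grating_vector(p, 120.0)    # first output grating
g2 = grating_vector(p, -120.0)   # second output grating

# Equation (5): the vector sum of all cooperating grating vectors is zero.
total = (g0[0] + g1[0] + g2[0], g0[1] + g1[1] + g2[1])
print(math.hypot(*total))        # ~0, up to floating-point error
```

The ±120° rotation used here is one convention consistent with closure; an orientation angle ϕ quoted relative to a differently signed vector (as in the 50° to 70° range mentioned below) describes the same geometry.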
Kx=n sin(θx), and Ky=n sin(θy). (6)
Here n is the refractive index of the substrate where in-coupled light is propagating, and the angles θx and θy define the direction of light propagation in the plane of the waveguide (x,y) in projection on the x-axis and y-axis, respectively. These angles may also represent the coordinates of angle space in which a 2D FOV of the waveguide may be defined. The (Kx, Ky) plane may be referred to herein as the K-space, and the normalized wavevector K=(Kx, Ky) as the in-plane K-vector or simply as the K-vector.
In the K-space, the in-coupled light may be graphically represented by a TIR ring 500. The TIR ring 500 is an area of the K-space bounded by a TIR circle 501 and a maximum-angle circle 502, both circles centered at K0=(0, 0) corresponding to the normal incidence upon the waveguide. The TIR circle 501 corresponds to the TIR angle βc. The maximum-angle circle 502 corresponds to a maximum propagation angle βmax for in-coupled light. States within the TIR circle 501 represent uncoupled light, i.e. the in-coming light that is incident upon the in-coupler 430, or the light coupled out of the waveguide by the out-coupler gratings 441 and/or 442. With the normalization, the radius rTIR of the TIR circle 501 and the radius rmax of the outer circle 502 may be defined by the following equations:
rTIR=1, rmax=n·sin(βmax) (7)
The greater the refractive index n, the wider is the TIR ring 500 and the broader is the angular range of input light of a wavelength λ that can be coupled into the waveguide.
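A K-state belongs to the TIR ring exactly when its radius lies between the two radii of equation (7). The following is a minimal illustrative check, assuming βmax=75° as in the earlier example; the specific numbers are hypothetical.

```python
import math

def in_tir_ring(Kx, Ky, n, beta_max_deg=75.0):
    """True if the normalized in-plane K-vector lies in the TIR ring of eq. (7):
    r_TIR = 1 <= |K| <= r_max = n*sin(beta_max)."""
    r = math.hypot(Kx, Ky)
    return 1.0 <= r <= n * math.sin(math.radians(beta_max_deg))

n = 2.4
# Normal incidence (K = 0) is an uncoupled state inside the TIR circle;
# a ray propagating at 50 deg in the waveguide lies within the ring.
print(in_tir_ring(0.0, 0.0, n))
print(in_tir_ring(0.0, n * math.sin(math.radians(50.0)), n))
```

For n = 2.4 the ring spans radii 1 to about 2.32, so the 50° guided state (|K| ≈ 1.84) falls inside it while the normal-incidence state does not.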
Arrows labeled g0, g1, and g2 in
The position, size, and shape of each partial FOV 520, 530 in the angle space, and thus the full 2D FOV of the waveguide, depend on the wavelength λ of the input light, on the ratios of the pitches p0, p1, and p2 of the input and output gratings to that wavelength, and on the relative orientation of the gratings. Thus, the 2D FOV of the waveguide may be suitably shaped and positioned in the angle space for a particular color channel or channels by selecting the pitch sizes and the relative orientation of the gratings. In some embodiments of waveguide 410, the output gratings 441, 442 may have the same pitch, p1=p2, and be symmetrically oriented relative to the input grating. In such embodiments the grating vectors g1, g2 of the first and second output gratings may be oriented at angles of +/−ϕ relative to the grating vector g0 of the in-coupler. By way of non-limiting example, the grating orientation angle ϕ may be in the range of 50 to 70 degrees, for example 60 to 66 degrees, and may depend on the refractive index of the waveguide.
In some embodiments, a single waveguide formed of an optically transparent high-index substrate may be used in a display system to convey multiple color channels of RGB light from an image source to a viewing area of a waveguide display, such as an eyebox of a NED. In some embodiments the same input and output gratings may be used for at least two color channels of the image light, for example for at least two of the Red, Green, and Blue RGB color channels, or for all three RGB color channels. The desirability of having a high refractive index n may be understood by noting that in the wavelength-normalized K-space the length of each grating vector scales with the wavelength, i.e. |gi|=λ/pi, where pi is the pitch of the i-th grating, i=0, 1, or 2. As the width of the TIR ring 500 is proportional to the refractive index n, greater values of the refractive index enable a broader polychromatic FOV, i.e. the common FOV that is shared by two or more color channels of the image light.
In some embodiments a maximum y-axis width (2α) of a symmetrical FOV shared by the two wavelengths, FOVy=(−α,+α), may be estimated from equations (8) and (9).
1+sin(α)=λ1/p0 (8)
n·sin(βmax)−sin(α)=λ2/p0 (9)
Equation (8) represents a condition that the shorter of the grating vectors 531, 532 is long enough to reach the TIR circle 501 from a state A1 of the FOV that is farthest from the TIR circle 501 in the direction of the grating vector. Equation (9) represents a condition that a K-state A2 at an opposite end of the FOV is far enough from the outer boundary 502 of the TIR ring 500 that the longer of the two grating vectors 531, 532 doesn't extend beyond it. These conditions provide an estimate for the pitch p0 of the in-coupler grating (equation (10)) and an estimate of the corresponding half-width α of the common FOV of the two wavelengths along the y-axis (equation (11)):

p0=(λ1+λ2)/(1+n·sin(βmax)) (10)

sin(α)=(λ1·n·sin(βmax)−λ2)/(λ1+λ2) (11)
An estimated width 2α of the shared 1D FOV for wavelengths λ1 and λ2 increases as the refractive index n of the waveguide increases above a minimum value nmin, which in some embodiments may be estimated as nmin=λ2/(λ1·sin(βmax)). By way of example the longer wavelength λ2 may correspond to red light, with a wavelength of, e.g., 635 nm, while the shorter wavelength λ1 may correspond to blue light, with a wavelength of, e.g., 465 nm, resulting in a minimum value of n of about 1.4 for a waveguide configured to transmit all three color channels of RGB light. According to Equation (11), in one embodiment an estimated width 2α of a symmetrical 1D FOV of a single one-layer waveguide that may be shared by all three channels of RGB light may be about 30 degrees for n=2.0, about 40 degrees for n=2.2, and about 63 degrees for n=2.6.
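The shared-FOV estimate can be reproduced numerically from equations (8) and (9). The sketch below is illustrative only; it assumes λ1=465 nm, λ2=635 nm, and a βmax near grazing (90° is used here as a hypothetical choice).

```python
import math

def shared_fov_width(n, lam1=465e-9, lam2=635e-9, beta_max_deg=90.0):
    """Estimate the width 2*alpha (degrees) of a symmetric 1D FOV shared
    by wavelengths lam1..lam2, per equations (8) and (9):
      p0       = (lam1 + lam2) / (1 + n*sin(beta_max))
      sin(a)   = (lam1*n*sin(beta_max) - lam2) / (lam1 + lam2)
    """
    s = (lam1 * n * math.sin(math.radians(beta_max_deg)) - lam2) / (lam1 + lam2)
    if s <= 0.0:
        return 0.0  # n is below the minimum index; no shared FOV exists
    return 2.0 * math.degrees(math.asin(s))

for n in (2.0, 2.2, 2.6):
    print(f"n = {n}: shared RGB FOV ~ {shared_fov_width(n):.0f} deg")
```

With these assumptions the widths round to about 31°, 41°, and 63° for n = 2.0, 2.2, and 2.6, in line with the approximate values quoted above.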
Equation (11) provides an estimate of a 1D FOV that is centered at normal incidence and may be supported by a single waveguide for polychromatic light with wavelengths from λ1 to λ2. A 2D FOV that the waveguide supports for polychromatic light at its output, e.g. at the eyebox, may further depend on the out-coupler, such as the number and configuration of the output gratings.
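The estimates of equations (8)-(11) can be checked with a minimal numerical sketch. The choice sin(βmax)=1 (grazing propagation) is an assumption made here for illustration; it reproduces both the nmin of about 1.4 and the quoted FOV widths for n=2.0, 2.2, and 2.6.

```python
import math

def shared_fov_deg(n, lam1=465e-9, lam2=635e-9, sin_beta_max=1.0):
    """Estimate the symmetric 1D FOV width 2*alpha (degrees) shared by
    wavelengths lam1..lam2 in a waveguide of index n, per equation (11):
    sin(alpha) = (n*lam1*sin_beta_max - lam2) / (lam1 + lam2)."""
    s = (n * lam1 * sin_beta_max - lam2) / (lam1 + lam2)
    if s <= 0:
        return 0.0  # index below n_min: no FOV is shared by both wavelengths
    return 2.0 * math.degrees(math.asin(s))

# Minimum index for a shared FOV (equation for n_min with sin_beta_max = 1)
n_min = 635.0 / 465.0  # ~1.37, i.e. "about 1.4" as stated in the text
for n in (2.0, 2.2, 2.6):
    print(f"n = {n}: shared 1D FOV ~ {shared_fov_deg(n):.0f} degrees")
```

The printed widths land within about a degree of the 30/40/63-degree figures quoted above, supporting the sin(βmax)≈1 reading.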
In some embodiments, a single waveguide made of an optically transparent high-index material with a refractive index of about 2.3, or preferably of 2.4 or greater, may be used in a display system to convey RGB light from an image source to an eyebox of a NED. In some embodiments, a NED may transmit image light with a single waveguide made of an optically transparent high-index material with a refractive index of at least 2.5. In some embodiments, a NED may transmit image light with a single waveguide made of an optically transparent high-index material with a refractive index of at least 2.6.
Referring now to
Referring to
Examples of possible materials that may be used for the HI layer 710 include LiNbO3, TiO2, GaN, AlN, SiC, CVD diamond, and ZnS. The thicknesses of the HI layer 710 and the substrate 720 may vary depending on their indices and/or design goals. By way of non-limiting example, the thickness d1 of the HI layer 710 may be in the range of 150 to 400 μm, for example about 300 μm. The substrate thickness d2 may be, for example, between 200 and 600 μm. Embodiments with layer and substrate thicknesses outside of these ranges may also be envisioned.
Continuing to refer to
Image light 711 may include a first color channel and a second color channel, with the second color channel comprising longer wavelengths than the first color channel. The first color channel may be indicated with a letter “B” and referred to as the blue color channel or blue light, and the second color channel may be indicated with a letter “R” and referred to as the red color channel or the red light. In some embodiments the first color channel and the second color channel may correspond to the “B” and “R” color channels of RGB light. The HI layer 710 may be configured to trap at least a portion of the red color channel of image light 711 by means of TIR at the interface 702 with substrate 720, while allowing the blue color channel of the image light to propagate into the substrate 720, where it can be trapped by TIR at the second outer surface 703 thereof. This is schematically illustrated by ray 711R of the red color channel of image light 711 being trapped in the HI layer 710, and ray 711B of the blue color channel of image light 711 propagating into the substrate 720 and experiencing TIR at the surface 703 thereof. Due to the higher refractive index of HI layer 710 relative to substrate 720, the propagation angle β1 of the red ray 711R in the HI layer 710 and the propagation angle of the blue ray 711B within substrate 720 may be substantially closer to each other than they would be if both rays propagated in the HI layer 710, or than the propagation angles of rays 611R and 611B in the waveguide 600 of
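The closeness of the red and blue propagation angles can be illustrated with a short sketch. All parameter values below (layer indices n1=2.6 and n2=1.8, grating pitch p0=320 nm, normal incidence) are hypothetical choices, and the standard first-order grating equation n·sin(β)=sin(α)+λ/p is assumed; the pitch is chosen so that the red channel satisfies TIR at the layer-substrate interface while the blue channel crosses into the substrate.

```python
import math

# Hypothetical parameters: HI-layer index n1, substrate index n2,
# input-grating pitch p0, normal incidence (sin(alpha) = 0).
n1, n2 = 2.6, 1.8
p0 = 320e-9
lam_R, lam_B = 635e-9, 465e-9

def prop_angle_deg(lam, n, p=p0, sin_alpha=0.0):
    """Propagation angle from the grating equation n*sin(beta) = sin(alpha) + lam/p."""
    return math.degrees(math.asin((sin_alpha + lam / p) / n))

# Sanity checks on the chosen pitch:
assert lam_R / p0 >= n2        # red is trapped in the HI layer by TIR at the interface
assert 1.0 <= lam_B / p0 < n2  # blue crosses into the substrate, TIRs at its outer surface

beta_R_hi  = prop_angle_deg(lam_R, n1)  # red trapped in the HI layer
beta_B_sub = prop_angle_deg(lam_B, n2)  # blue refracted into the substrate
beta_B_hi  = prop_angle_deg(lam_B, n1)  # blue if it propagated in the HI layer instead

print(beta_R_hi, beta_B_sub, beta_B_hi)
```

With these values, red in the HI layer and blue in the substrate propagate only a few degrees apart, while the same two wavelengths confined to a single medium would differ by roughly four times as much.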
The coupling of image light 711 of wavelength λ in the HI layer 710 by the input grating 730 may be described using equation (1) with n=n1 and p=p0, where p0 is the pitch of the input grating 730. A TIR condition on the interface 702 may be expressed as
n1 sin(β1)≥n2 (12)
From equations (1) and (12), a first specific angle of incidence α12 of image light 711 upon the input grating 730, i.e. the angle of incidence at which the image light of wavelength λ experiences TIR at interface 702, may be estimated from the following equation (13):
sin(α12)=n2−λ/p0 (13)
Rays of image light 711 that are incident upon the waveguide at smaller angles, α<α12, may propagate into the substrate 720 and may experience TIR at the outer surface 703 thereof. Rays incident at angles slightly smaller than α12 will be partially reflected back into the HI layer 710 and will partially propagate into the substrate 720 at a “glancing” angle as illustrated at 713, with the reflected fraction growing as the angle of incidence approaches the TIR condition at the interface 702, and the transmitted fraction vanishing when the TIR condition is reached.
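The growth of the reflected fraction near the TIR condition follows the Fresnel equations. The sketch below, with hypothetical indices n1=2.6 and n2=1.8 for the HI layer and substrate, computes the s-polarization power reflectance at the interface and shows it rising toward unity as the internal propagation angle approaches the critical angle (about 43.8 degrees for these indices).

```python
import math

def fresnel_rs(theta_i_deg, n1=2.6, n2=1.8):
    """s-polarization power reflectance at an n1 -> n2 interface (Fresnel)."""
    ti = math.radians(theta_i_deg)
    sin_t = n1 * math.sin(ti) / n2       # Snell's law for the transmitted angle
    if sin_t >= 1.0:
        return 1.0                        # total internal reflection
    ct = math.sqrt(1.0 - sin_t ** 2)
    ci = math.cos(ti)
    r = (n1 * ci - n2 * ct) / (n1 * ci + n2 * ct)
    return r * r

theta_c = math.degrees(math.asin(1.8 / 2.6))  # critical angle of the interface
for th in (30, 40, 43, 44):
    print(f"{th:4.1f} deg: R_s = {fresnel_rs(th):.3f}")
```

Below the critical angle the transmitted (glancing) fraction shrinks smoothly; at and above it the reflectance is exactly 1, matching the behavior described for ray 713.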
A second specific angle of incidence α23 upon waveguide 700, i.e. the smallest angle of incidence for which the image light of wavelength λ experiences TIR at the outer surface 703, may be estimated from the following equation (14):
sin(α23)=1−λ/p0 (14)
A limitation on a maximum propagation angle β2max of the in-coupled light in substrate 720 yields a third specific angle of incidence α3, which may be estimated from the following equation (15):
sin(α3)=n2·sin(β2max)−λ/p0 (15)
A limitation on a maximum propagation angle β1max of the light trapped in the HI layer 710 yields a fourth specific angle of incidence α4, which may be estimated from the following equation (16):
sin(α4)=n1·sin(β1max)−λ/p0 (16)
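The four specific angles of equations (13)-(16) can be evaluated together to see how they partition the range of incidence angles. The parameter values below (n1=2.6, n2=1.8, p0=400 nm, λ=635 nm, β1max=β2max=75 degrees) are hypothetical choices for illustration only.

```python
import math

# Hypothetical parameters: HI-layer index n1, substrate index n2,
# input-grating pitch p0, wavelength lam, maximum propagation angles.
n1, n2 = 2.6, 1.8
p0, lam = 400e-9, 635e-9
beta1_max = beta2_max = math.radians(75)

g = lam / p0  # normalized in-coupler grating vector length, lam/p0

def ang(s):
    """Angle of incidence (degrees) whose sine is s."""
    return math.degrees(math.asin(s))

alpha_23 = ang(1.0 - g)                       # eq. (14): TIR onset at outer surface 703
alpha_3  = ang(n2 * math.sin(beta2_max) - g)  # eq. (15): max propagation angle in substrate
alpha_12 = ang(n2 - g)                        # eq. (13): TIR onset at interface 702
alpha_4  = ang(n1 * math.sin(beta1_max) - g)  # eq. (16): max propagation angle in HI layer

print(f"alpha_23 = {alpha_23:.1f}, alpha_3 = {alpha_3:.1f}, "
      f"alpha_12 = {alpha_12:.1f}, alpha_4 = {alpha_4:.1f} (degrees)")
```

For these values the angles order as α23 < α3 < α12 < α4: incidence angles between α23 and α12 yield light guided partly in the substrate, while angles between α12 and α4 yield light trapped in the HI layer, consistent with the description above.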
In
Light propagating in the HI layer 710 may be diffracted out of the waveguide by out-coupler gratings 741 and 742, which may be generally disposed anywhere within the HI layer or at a surface thereof. The out-coupler gratings 741 and 742 may be configured so that the sum (g1+g2) of their grating vectors g1 and g2 equals (−g0), where g0 is the grating vector of the in-coupler 730, so that successive diffraction from these two gratings de-couples the in-coupled light from the waveguide in the direction of its incidence upon the waveguide. In-coupled light that penetrates into the substrate 720 may be out-coupled from the waveguide by diffraction upon some combination of two or more of the first grating 741, the second grating 742, and an optional third grating 743 when present. In some embodiments the third out-coupler grating 743 may be configured so that the vector sum of its grating vector g3 with the grating vector g1 of the first grating 741 and the grating vector g0 of the in-coupler 730 is substantially zero, so that successive diffraction of the in-coupled light that propagates partly in the substrate 720 from the first and third gratings 741, 743 out-couples that light in the direction of its incidence upon the waveguide. In some embodiments the second grating 742 and the third grating 743 may have equal grating vectors g3=g2. In some embodiments the third grating 743 may be absent, and the in-coupled light that propagates partly in the substrate 720 may be out-coupled from the waveguide by successive diffractions from the first grating 741 and the second grating 742. In some embodiments a fourth grating 744 with a grating vector g4 may be provided to de-couple the substrate-penetrating in-coupled light from the waveguide in cooperation with one of the first grating 741, the second grating 742, or the third grating 743.
The fourth grating 744 may be disposed for example at the outer substrate surface 703, where it may be superimposed or stacked with the third grating 743. In some embodiments the grating vector g4 of the fourth grating 744 may be equal to the grating vector g1 of the first grating 741.
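The grating-vector closure condition g0+g1+g2=0 can be verified numerically. The vectors below are hypothetical 2D grating vectors in normalized K-space, chosen with equal magnitude and 120-degree spacing (one common way of satisfying the closure condition); any arbitrary in-plane k-state is returned to itself after the three successive diffractions.

```python
# Hypothetical grating vectors in normalized K-space: equal magnitudes,
# spaced 120 degrees apart so that g0 + g1 + g2 = 0.
g0 = (1.5875, 0.0)          # in-coupler 730, directed along +x
g1 = (-0.79375, 1.374784)   # first out-coupler grating 741
g2 = (-0.79375, -1.374784)  # second out-coupler grating 742

def add(*vs):
    """Component-wise sum of 2D vectors given as tuples."""
    return tuple(sum(c) for c in zip(*vs))

k_in = (0.1, -0.2)  # an arbitrary in-plane k-state within the FOV
k_out = add(k_in, g0, g1, g2)  # after in-coupling and two out-coupler diffractions

print("closure:", add(g0, g1, g2), "k_in:", k_in, "k_out:", k_out)
```

Because the three grating vectors sum to zero, k_out coincides with k_in: the out-coupled ray leaves the waveguide parallel to its original direction of incidence, which is what preserves the image geometry at the eyebox.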
The example multi-layer waveguide 700 illustrated in
Referring to
The central HI layer 922 is configured to trap the “glancing” ray 952 by TIR within itself, generally in a way similar to that described above with reference to
Turning now to
Turning now to
Embodiments described above with reference to
Embodiments described above implement a method for conveying image light from an image light source to an eyebox with a waveguide, by selectively confining a first portion of the image light to a smaller volume of the waveguide body than a second portion of the image light, thereby equalizing the conditions of their de-coupling out of the waveguide.
In some embodiments the method may include: directing the image light onto an input area of a waveguide comprising two opposed outer surfaces and a high-index portion therebetween, the high-index portion extending from the input area of the waveguide to an output area thereof, the high-index portion having a greater refractive index than a portion of the waveguide body adjacent to one of the opposed outer surfaces thereof; coupling the image light into the waveguide body at a range of propagation angles comprising a first propagation angle and a second propagation angle; and, propagating the image light in the waveguide from the input area to the output area so that first rays of the image light coupled into the waveguide at the first propagation angle are guided toward the output area by total internal reflection at the opposed outer surfaces of the waveguide, while second rays of the image light coupled into the waveguide at the second propagation angle propagate toward the output area within the high-index portion of the waveguide body.
In some embodiments the method may include: directing the image light onto an input area of a waveguide comprising two opposed outer surfaces and a middle portion therebetween, the middle portion extending from the input area of the waveguide to an output area thereof, the middle portion having a greater refractive index than portions of the waveguide body adjacent to the opposed outer surfaces thereof; coupling the image light into the waveguide body at a range of propagation angles comprising a first propagation angle and a second propagation angle; and, propagating the image light in the waveguide from the input area to the output area so that first rays of the image light coupled into the waveguide at the first propagation angle are guided toward the output area by total internal reflection at the opposed outer surfaces of the waveguide, while second rays of the image light coupled into the waveguide at the second propagation angle propagate toward the output area within the middle portion of the waveguide body.
Referring to
Embodiments of the present disclosure may include, or be implemented in conjunction with, an artificial reality system. An artificial reality system adjusts sensory information about the outside world obtained through the senses, such as visual information, audio, touch (somatosensation) information, acceleration, and balance, in some manner before presentation to a user. By way of non-limiting examples, artificial reality may include virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, somatic or haptic feedback, or some combination thereof. Any of this content may be presented in a single channel or in multiple channels, such as in a stereo video that produces a three-dimensional effect to the viewer. Furthermore, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to, for example, create content in artificial reality and/or are otherwise used in (e.g., perform activities in) artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including a wearable display such as an HMD connected to a host computer system, a standalone HMD, a near-eye display having a form factor of eyeglasses, a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
Referring to
In some embodiments, the front body 1302 includes locators 1308 and an inertial measurement unit (IMU) 1310 for tracking acceleration of the HMD 1300, and position sensors 1312 for tracking position of the HMD 1300. The IMU 1310 is an electronic device that generates data indicating a position of the HMD 1300 based on measurement signals received from one or more of position sensors 1312, which generate one or more measurement signals in response to motion of the HMD 1300. Examples of position sensors 1312 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, a type of sensor used for error correction of the IMU 1310, or some combination thereof. The position sensors 1312 may be located external to the IMU 1310, internal to the IMU 1310, or some combination thereof.
The locators 1308 are tracked by an external imaging device of a virtual reality system, such that the virtual reality system can track the location and orientation of the HMD 1300. Information generated by the IMU 1310 and the position sensors 1312 may be compared with the position and orientation obtained by tracking the locators 1308, for improved tracking accuracy of the position and orientation of the HMD 1300. Accurate position and orientation are important for presenting appropriate virtual scenery to the user as the latter moves and turns in 3D space.
The HMD 1300 may further include a depth camera assembly (DCA) 1311, which captures data describing depth information of a local area surrounding some or all of the HMD 1300. To that end, the DCA 1311 may include a laser radar (LIDAR), or a similar device. The depth information may be compared with the information from the IMU 1310, for better accuracy of determination of position and orientation of the HMD 1300 in 3D space.
The HMD 1300 may further include an eye tracking system for determining orientation and position of the user's eyes in real time. The determined position of the user's eyes allows the HMD 1300 to perform (self-) adjustment procedures. The obtained position and orientation of the eyes also allows the HMD 1300 to determine the gaze direction of the user and to adjust the image generated by the display system 1380 accordingly. In one embodiment, the vergence, that is, the convergence angle of the user's eyes gaze, is determined. The determined gaze direction and vergence angle may also be used for real-time compensation of visual artifacts dependent on the angle of view and eye position. Furthermore, the determined vergence and gaze angles may be used for interaction with the user, highlighting objects, bringing objects to the foreground, creating additional objects or pointers, etc. An audio system may also be provided including e.g. a set of small speakers built into the front body 1302.
Referring to
As described above with reference to
The I/O interface 1315 is a device that allows a user to send action requests and receive responses from the console 1390. An action request is a request to perform a particular action. For example, an action request may be an instruction to start or end capture of image or video data or an instruction to perform a particular action within an application. The I/O interface 1315 may include one or more input devices, such as a keyboard, a mouse, a game controller, or any other suitable device for receiving action requests and communicating the action requests to the console 1390. An action request received by the I/O interface 1315 is communicated to the console 1390, which performs an action corresponding to the action request. In some embodiments, the I/O interface 1315 includes an IMU that captures calibration data indicating an estimated position of the I/O interface 1315 relative to an initial position of the I/O interface 1315. In some embodiments, the I/O interface 1315 may provide haptic feedback to the user in accordance with instructions received from the console 1390. For example, haptic feedback can be provided when an action request is received, or the console 1390 communicates instructions to the I/O interface 1315 causing the I/O interface 1315 to generate haptic feedback when the console 1390 performs an action.
The console 1390 may provide content to the HMD 1300 for processing in accordance with information received from one or more of: the IMU 1310, the DCA 1311, the eye tracking system 1325, and the I/O interface 1315. In the example shown in
The application store 1355 may store one or more applications for execution by the console 1390. An application is a group of instructions that, when executed by a processor, generates content for presentation to the user. Content generated by an application may be in response to inputs received from the user via movement of the HMD 1300 or the I/O interface 1315. Examples of applications include: gaming applications, presentation and conferencing applications, video playback applications, or other suitable applications.
The tracking module 1360 may calibrate the AR/VR system 1350 using one or more calibration parameters and may adjust one or more calibration parameters to reduce error in determination of the position of the HMD 1300 or the I/O interface 1315. Calibration performed by the tracking module 1360 also accounts for information received from the IMU 1310 in the HMD 1300 and/or an IMU included in the I/O interface 1315, if any. Additionally, if tracking of the HMD 1300 is lost, the tracking module 1360 may re-calibrate some or all of the AR/VR system 1350.
The tracking module 1360 may track movements of the HMD 1300 or of the I/O interface 1315, the IMU 1310, or some combination thereof. For example, the tracking module 1360 may determine a position of a reference point of the HMD 1300 in a mapping of a local area based on information from the HMD 1300. The tracking module 1360 may also determine positions of the reference point of the HMD 1300 or a reference point of the I/O interface 1315 using data indicating a position of the HMD 1300 from the IMU 1310 or using data indicating a position of the I/O interface 1315 from an IMU included in the I/O interface 1315, respectively. Furthermore, in some embodiments, the tracking module 1360 may use portions of data indicating a position of the HMD 1300 from the IMU 1310 as well as representations of the local area from the DCA 1311 to predict a future location of the HMD 1300. The tracking module 1360 provides the estimated or predicted future position of the HMD 1300 or the I/O interface 1315 to the processing module 1365.
The processing module 1365 may generate a 3D mapping of the area surrounding some or all of the HMD 1300 (“local area”) based on information received from the HMD 1300. In some embodiments, the processing module 1365 determines depth information for the 3D mapping of the local area based on information received from the DCA 1311 that is relevant for techniques used in computing depth. In various embodiments, the processing module 1365 may use the depth information to update a model of the local area and generate content based in part on the updated model.
The processing module 1365 executes applications within the AR/VR system 1350 and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the HMD 1300 from the tracking module 1360. Based on the received information, the processing module 1365 determines content to provide to the HMD 1300 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the processing module 1365 generates content for the HMD 1300 that mirrors the user's movement in a virtual environment or in an environment augmenting the local area with additional content. Additionally, the processing module 1365 performs an action within an application executing on the console 1390 in response to an action request received from the I/O interface 1315 and provides feedback to the user that the action was performed. The provided feedback may be visual or audible feedback via the HMD 1300 or haptic feedback via the I/O interface 1315.
In some embodiments, based on the eye tracking information (e.g., orientation of the user's eyes) received from the eye tracking system 1325, the processing module 1365 determines resolution of the content provided to the HMD 1300 for presentation to the user with the image projector(s) 1314. The processing module 1365 may provide the content to the HMD 1300 having a maximum pixel resolution in a foveal region of the user's gaze. The processing module 1365 may provide a lower pixel resolution in the periphery of the user's gaze, thus lessening power consumption of the AR/VR system 1350 and saving computing resources of the console 1390 without compromising a visual experience of the user. In some embodiments, the processing module 1365 can further use the eye tracking information to adjust where objects are displayed for the user's eye to prevent vergence-accommodation conflict and/or to offset optical distortions and aberrations.
The hardware used to implement the various illustrative logics, logical blocks, modules, and circuits described in connection with the aspects disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
The present disclosure is not to be limited in scope by the specific embodiments described herein. Indeed, other various embodiments and modifications, in addition to those described herein, will be apparent to those of ordinary skill in the art from the foregoing description and accompanying drawings. Thus, such other embodiments and modifications are intended to fall within the scope of the present disclosure. Further, although the present disclosure has been described herein in the context of a particular implementation in a particular environment for a particular purpose, those of ordinary skill in the art will recognize that its usefulness is not limited thereto and that the present disclosure may be beneficially implemented in any number of environments for any number of purposes. Accordingly, the claims set forth below should be construed in view of the full breadth and spirit of the present disclosure as described herein.
Assignment executed Jan 10, 2020: Facebook Technologies, LLC (assignment on the face of the patent). Assignment executed Jan 13, 2020: Lee, Hee Yoon to Facebook Technologies, LLC (assignment of assignors interest; Reel 051714, Frame 0436). Change of name executed Mar 18, 2022: Facebook Technologies, LLC to Meta Platforms Technologies, LLC (Reel 062444, Frame 0855).