motion-sensing display apparatuses supported near a user's eye including partially transparent screens at least partially disposed within the user's field of vision, image generators positioned to display an image on a first side of the screen, motion capture devices positioned near the screen and configured to capture a user gesture occurring beyond the screen in the user's field of vision, and processors in data communication with the image generator and the motion capture device, the processors configured to execute computer executable instructions in response to the user gesture. In some examples, motion-sensing display apparatuses include cameras. In some further examples, image generators display user interfaces on screens.
1. A motion-sensing display apparatus supported near a user's eye, the motion-sensing display apparatus comprising:
a partially transparent screen at least partially disposed within the user's field of vision;
an image generator positioned to display an image on a first side of the screen;
a user interface displayed by the image generator on the screen, the user interface comprising at least one element that overlays, and thereby obscures, a portion of the scene naturally viewed through the screen;
a motion capture device positioned near the screen and configured to capture user gestures occurring beyond the screen in the user's field of vision, wherein a first user gesture interacts with at least one of the overlaid user interface elements and a second user gesture interacts with at least one of the overlaid user interface elements; and
a processor in data communication with the image generator and the motion capture device, the processor configured to execute computer executable instructions in response to the first user gesture in concert with the second user gesture.
19. A motion-sensing display apparatus for interpreting a user's motion to control software applications, comprising:
a partially transparent screen at least partially disposed within the user's field of vision;
an image generator in a position to display a user interface on the screen;
a user interface displayed by the image generator on the screen, the user interface comprising at least one element that overlays, and thereby obscures, a portion of the scene naturally viewed through the screen;
a motion capture device mounted near the screen and configured to capture a user gesture occurring beyond the screen in the user's field of vision and interacting with at least one of the overlaid user interface elements; and
a processor in data communication with the image generator and the motion capture device, the processor configured to manipulate the user interface in response to the user gesture, and configured to execute computer executable instructions to select a screen element of the user interface generated by the image generator in response to the user extending two fingers in parallel and substantially aligning the parallel extended fingers with the selected element in the user's field of vision.
18. A motion-sensing display apparatus for interpreting a user's motion to control software applications, comprising:
an eyewear article including a substantially transparent lens;
an image generator in a position to display an image on the lens;
a user interface displayed by the image generator on the lens, the user interface comprising a plurality of elements that overlay, and thereby obscure, a portion of the scene naturally viewed through the lens;
a motion capture device comprising a camera positioned proximate the lens and configured to capture user gestures made in front of the lens; and
a processor in data communication with the image generator and the motion capture device;
wherein a first user gesture performed by a first hand interacts with a first of the plurality of overlaid user interface elements, a second user gesture performed by a second hand and in concert with the first user gesture interacts with a second of the plurality of overlaid user interface elements, the processor is configured to execute a first set of computer executable instructions in response to the first user gesture and a second set of computer executable instructions in response to the second user gesture, and the motion capture device is configured to distinguish between gestures made by the user and gestures made by a person other than the user.
2. The motion-sensing display apparatus of
3. The motion-sensing display apparatus of
4. The motion-sensing display apparatus of
6. The motion-sensing display apparatus of
7. The motion-sensing display apparatus of
8. The motion-sensing display apparatus of
9. The motion-sensing display apparatus of
10. The motion-sensing display apparatus of
11. The motion-sensing display apparatus of
12. The motion-sensing display apparatus of
13. The motion-sensing display apparatus of
14. The motion-sensing display apparatus of
15. The motion-sensing display apparatus of
16. The motion-sensing display apparatus of
17. The motion-sensing display apparatus of
The present disclosure relates generally to motion-sensing display apparatuses. In particular, motion-sensing display apparatuses supported near a user's eye that execute computer executable instructions in response to user gestures are described.
Known motion-sensing display apparatuses are not entirely satisfactory for the range of applications in which they are employed. Some existing display apparatuses are configured to interface with motion-sensing technologies, such as the Microsoft® Kinect® system. Many such apparatuses, however, do not provide a display proximate a user's eye. As a result, these devices are often ill-suited for displaying images that encompass large portions of a user's field of vision. Further, conventional motion sensing technology is generally large and not portable, limiting its use outside of a user's home or during physical activities.
Often, these displays include an opaque backing that prevents a user from seeing beyond the display. As a result, these devices are often unable to augment or overlay scenes within the user's field of vision. Other devices may use a camera to capture the scene beyond the display and present an augmented version of that captured scene on a display. This approach, however, often results in unnatural and unreliable displays.
Certain display apparatuses are mounted near a user's eye with a piece of eyewear. These devices often produce a display that encompasses a large portion of a user's field of vision. These devices, however, often include an opaque backing, which prevents augmenting natural scenes and viewing user motions or gestures occurring beyond the screen. This, commonly in conjunction with the use of unwieldy accessories, leads to poor motion capture capabilities, if any exist at all.
Some other devices mounted on a piece of eyewear, such as Vuzix® brand eyewear devices, provide a display on a partially transparent screen. However, these devices often lack support for image capturing and motion sensing functionality. By extension, they often lack the ability to execute computer instructions in response to motion or natural images, including the ability to manipulate the display.
Thus, there exists a need for motion-sensing display apparatuses that improve upon and advance the design of known motion-sensing devices and display devices. Examples of new and useful motion-sensing display apparatuses relevant to the needs existing in the field are discussed below.
The present disclosure is directed to motion-sensing display apparatuses supported near a user's eye including partially transparent screens at least partially disposed within the user's field of vision, image generators positioned to display an image on a first side of the screen, motion capture devices positioned near the screen and configured to capture a user gesture occurring beyond the screen in the user's field of vision, and processors in data communication with the image generator and the motion capture device, the processors configured to execute computer executable instructions in response to the user gesture. In some examples, motion-sensing display apparatuses include cameras. In some further examples, image generators display user interfaces on screens.
The disclosed motion-sensing display apparatuses will become better understood through review of the following detailed description in conjunction with the figures. The detailed description and figures provide merely examples of the various inventions described herein. Those skilled in the art will understand that the disclosed examples may be varied, modified, and altered without departing from the scope of the inventions described herein. Many variations are contemplated for different applications and design considerations; however, for the sake of brevity, each and every contemplated variation is not individually described in the following detailed description.
Throughout the following detailed description, examples of various motion-sensing display apparatuses are provided. Related features in the examples may be identical, similar, or dissimilar in different examples. For the sake of brevity, related features will not be redundantly explained in each example. Instead, the use of related feature names will cue the reader that the feature with a related feature name may be similar to the related feature in an example explained previously. Features specific to a given example will be described in that particular example. The reader should understand that a given feature need not be the same or similar to the specific portrayal of a related feature in any given figure or example.
Processor 160 may acquire data from other elements of motion-sensing display apparatus 100 or from external sources and execute computer executable code in response to this data. For example, processor 160 is configured to acquire data from motion capture device 140 such as data that corresponds to a user gesture. Processor 160 may additionally or alternatively acquire data from microphone 150. In some examples, the processor acquires data from a separate device, such as a portable music player, a personal data assistant, a smart phone, a global positioning system, or the like.
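The acquire-and-respond behavior described above can be sketched as a simple dispatch loop: the processor receives a gesture event from the motion capture device and runs whatever instructions are registered for that gesture. This is an illustrative sketch only; the gesture names, event fields, and handler registry below are hypothetical and not taken from the disclosure.

```python
# Minimal sketch of gesture-driven dispatch: handlers are registered per
# gesture name, and an incoming event from the capture device triggers the
# matching handler. All identifiers here are illustrative assumptions.

GESTURE_HANDLERS = {}

def on_gesture(name):
    """Register a handler function for a named gesture."""
    def register(func):
        GESTURE_HANDLERS[name] = func
        return func
    return register

@on_gesture("swipe_left")
def next_page(event):
    # Execute instructions in response to the captured gesture.
    return f"advanced past {event['target']}"

def dispatch(event):
    """Run the handler registered for the captured gesture, if any."""
    handler = GESTURE_HANDLERS.get(event["gesture"])
    return handler(event) if handler else None

print(dispatch({"gesture": "swipe_left", "target": "menu"}))
```

A real apparatus would also fuse other inputs (the microphone, or a paired smart phone) into the same event stream before dispatch.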
Processor 160 is in data communication with image generator 130 and may instruct image generator 130 to generate and manipulate a display projected on lens 120.
As gesture 192 illustrates, motion-sensing display apparatus 100 is capable of distinguishing a user from the remainder of a captured scene. For example, motion-sensing display apparatus 100 may distinguish between different users, such as when a person other than the user performs a second user gesture in the view of motion capture device 140.
Processor 160 may additionally store captured data to learn a user's features. In certain examples, users may confirm or correct the results of processor 160's user-differentiation functionality, training processor 160 to better differentiate users.
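The verify-and-train loop described above can be illustrated with a toy differentiator: each user-confirmed sample is folded into a per-user profile (here, a running mean of a small feature vector), and classification picks the nearest stored profile. The class, labels, and feature representation are hypothetical assumptions for illustration, not the disclosed implementation.

```python
# Hypothetical sketch of user differentiation with verification-based
# training: confirmed samples refine a per-user mean feature vector, and
# classification returns the user whose profile is closest.

class UserDifferentiator:
    def __init__(self):
        self.profiles = {}  # user label -> (mean feature vector, sample count)

    def classify(self, features):
        """Guess which stored user profile is closest to `features`."""
        if not self.profiles:
            return None
        def sq_dist(label):
            mean, _ = self.profiles[label]
            return sum((m - f) ** 2 for m, f in zip(mean, features))
        return min(self.profiles, key=sq_dist)

    def confirm(self, label, features):
        """Fold a user-verified sample into that user's running-mean profile."""
        mean, n = self.profiles.get(label, ([0.0] * len(features), 0))
        updated = [(m * n + f) / (n + 1) for m, f in zip(mean, features)]
        self.profiles[label] = (updated, n + 1)

d = UserDifferentiator()
d.confirm("wearer", [0.9, 0.1])
d.confirm("bystander", [0.1, 0.8])
print(d.classify([0.85, 0.2]))  # nearest profile is the wearer's
```

With such a loop, gestures from a person other than the user can be recognized and ignored, or routed to a different set of instructions.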
Motion capture accessory 299 is configured to augment motion capture device 240 in capturing the gesture. Motion capture accessory 299 defines a glove in data communication with processor 260.
Accessories may additionally include elements that allow motion capture device 240 to track its position more accurately. Such elements may be bodies attached to the exterior of the accessories. Accessories may additionally or alternatively be constructed of a material selected for increased compatibility with motion capture device 240. Such accessories may assist motion capture device 240 in capturing the movement, location, and shape of the relevant parts of the user or of the accessory without requiring a data connection with processor 260.
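Passive tracking of the kind described above can be sketched simply: a high-contrast marker on the accessory appears as bright pixels in the capture device's image, and its position is recovered as the centroid of those pixels, with no data link to the processor. The thresholding scheme and frame representation below are illustrative assumptions, not the disclosed method.

```python
# Illustrative sketch of passive-marker tracking: the accessory's marker is
# located as the centroid of pixels brighter than a threshold in a grayscale
# frame, so the accessory needs no electronics or data connection.

def marker_centroid(frame, threshold=200):
    """Return the (row, col) centroid of pixels brighter than `threshold`,
    or None if no marker pixels are found. `frame` is a 2-D list of
    grayscale intensities."""
    row_sum, col_sum, count = 0, 0, 0
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value > threshold:
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None
    return (row_sum / count, col_sum / count)

frame = [
    [0,   0,   0,   0],
    [0, 255, 255,   0],
    [0, 255, 255,   0],
    [0,   0,   0,   0],
]
print(marker_centroid(frame))  # centroid of the 2x2 bright block
```

Tracking the centroid across successive frames yields the accessory's motion, which the processor can then interpret as a gesture.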
The disclosure above encompasses multiple distinct inventions with independent utility. While each of these inventions has been disclosed in a particular form, the specific embodiments disclosed and illustrated above are not to be considered in a limiting sense as numerous variations are possible. The subject matter of the inventions includes all novel and non-obvious combinations and subcombinations of the various elements, features, functions and/or properties disclosed above and inherent to those skilled in the art pertaining to such inventions. Where the disclosure or subsequently filed claims recite “a” element, “a first” element, or any such equivalent term, the disclosure or claims should be understood to incorporate one or more such elements, neither requiring nor excluding two or more such elements.
Applicant(s) reserves the right to submit claims directed to combinations and subcombinations of the disclosed inventions that are believed to be novel and non-obvious. Inventions embodied in other combinations and subcombinations of features, functions, elements and/or properties may be claimed through amendment of those claims or presentation of new claims in the present application or in a related application. Such amended or new claims, whether they are directed to the same invention or a different invention and whether they are different, broader, narrower or equal in scope to the original claims, are to be considered within the subject matter of the inventions described herein.
Patent | Priority | Assignee | Title |
7920102, | Dec 15 1999 | AMERICAN VEHICULAR SCIENCES LLC | Vehicular heads-up display system |
8179604, | Jul 13 2011 | GOOGLE LLC | Wearable marker for passive interaction |
8203502, | May 25 2011 | GOOGLE LLC | Wearable heads-up display with integrated finger-tracking input sensor |
8558759, | Jul 08 2011 | GOOGLE LLC | Hand gestures to signify what is important |
20070075919
20100066676
20110213664
20110214082
20110221656
20110221657
20110221658
20110221659
20110221668
20110221669
20110221670
20110221671
20110221672
20110221793
20110221896
20110221897
20110222745
20110225536
20110227812
20110227813
20110227820
20110231757
20120062445
20120075168
20120235899
20120262574
20120293408
20120293544
20120309535
20120309536
20130050069