An interactive three-dimensional display system includes a three-dimensional display panel which has an optical sensor array, an interactive device which includes a projection light source and a shadow mask, and an image recognizing unit. The shadow mask has a pattern to define an image projected by the interactive device. The image is captured by the optical sensor array. The pattern includes two strip patterns which cross each other. The image includes two strip images which cross each other. The image recognizing unit is electrically connected with the optical sensor array and calculates relative positions of the interactive device and the three-dimensional display panel according to the image. A method of calculating the relative positions includes calculating according to the lengths of one of the strip patterns and one of the strip images, and a divergent angle and tilt angle of the projection light source.
9. A method of calculating a distance, suitable for calculating a shortest distance between an interactive device and an optical sensor array, wherein the interactive device comprises a projection light source and a shadow mask, the shadow mask has a pattern which defines an image projected by the interactive device, the image is captured by the optical sensor array, the pattern comprises two strip patterns which cross each other, and the image comprises two strip images which cross each other, the method of calculating the distance comprising:
calculating the shortest distance between the interactive device and the optical sensor array according to a length of one of the strip patterns, a length of only one of the strip images, a divergent angle of the projection light source, and a tilt angle of the projection light source.
1. An interactive three-dimensional (3D) display system, comprising:
a three-dimensional display panel having an optical sensor array;
an interactive device comprising a projection light source and a shadow mask, wherein the shadow mask has a pattern which defines an image projected by the interactive device, the image being captured by the optical sensor array, the pattern comprising two strip patterns which cross each other, and the image comprising two strip images which cross each other; and
an image recognizing unit electrically connected with the optical sensor array, wherein the image recognizing unit calculates relative positions of the interactive device and the three-dimensional display panel according to the image captured by the optical sensor array, a method of calculating the relative positions of the interactive device and the three-dimensional display panel comprises: calculating the relative positions of the interactive device and the three-dimensional display panel according to a length of one of the strip patterns, a length of only one of the strip images, a divergent angle of the projection light source, and a tilt angle of the projection light source.
2. The interactive three-dimensional display system as claimed in
3. The interactive three-dimensional display system as claimed in
4. The interactive three-dimensional display system as claimed in
5. The interactive three-dimensional display system as claimed in
6. The interactive three-dimensional display system as claimed in
7. The interactive three-dimensional display system as claimed in
8. The interactive three-dimensional display system as claimed in
10. The method of calculating the distance as claimed in
12. The method of calculating the distance as claimed in
WS = 2 × (h secθ × cotψ + x).
This application claims the priority benefit of Taiwan application serial no. 99108357, filed on Mar. 22, 2010. The entirety of the above-mentioned patent application is hereby incorporated by reference herein and made a part of this specification.
1. Field of the Invention
The invention is related to a three-dimensional (3D) display system, and in particular to an interactive three-dimensional display system.
2. Description of Related Art
In recent years, as display technology has advanced, users have become increasingly demanding about display quality (such as image resolution and color saturation). However, besides high resolution and high color saturation, displays capable of presenting three-dimensional images have been developed in order to satisfy users' desire to view realistic images. As three-dimensional display technology continues to advance, it is foreseeable that real-time interaction between the user and three-dimensional images will become the trend in next-generation human-machine interaction.
Currently, a three-dimensional interactive display device interacts with the user by capturing the user's three-dimensional spatial position. In actual operation, current interactive display devices sense signals input by the user and track the positions of the user's fingers by using a tracking device externally installed on the three-dimensional display. In the conventional art, the relative positions of the tracking device and the three-dimensional display limit the sensing range of the tracking device, so that when the user clicks the three-dimensional image displayed by the three-dimensional display at certain positions or at an oblique angle, it is difficult for the tracking device to sense the signal input by the user, and misjudgment may even occur, thereby reducing the interactive sensitivity. Moreover, the additionally installed tracking device also increases the volume of the three-dimensional display, thereby causing inconvenience in spatial arrangement.
The invention provides an interactive three-dimensional display system which has good interactive sensitivity.
The invention provides a method of calculating a distance, which is suitable for being used in an interactive three-dimensional display system to calculate the relative positions of an interactive device and a three-dimensional display panel.
The invention provides an interactive three-dimensional display system which includes a three-dimensional display panel, an interactive device, and an image recognizing unit. The three-dimensional display panel has an optical sensor array. The interactive device includes a projection light source and a shadow mask. The shadow mask has a pattern which defines an image projected by the interactive device. The image is captured by the optical sensor array, wherein the pattern includes two strip patterns which cross each other, and the image includes two strip images which cross each other. In addition, the image recognizing unit is electrically connected with the optical sensor array and calculates relative positions of the interactive device and the three-dimensional display panel according to the image captured by the optical sensor array. The method of calculating the relative positions of the interactive device and the three-dimensional display panel includes calculating the relative positions according to the length of one of the strip patterns, the length of one of the strip images, a divergent angle of the projection light source, and a tilt angle of the projection light source.
According to an embodiment of the invention, the pattern has two strip shading patterns which cross each other, and the strip shading patterns correspond to the strip images which cross each other.
According to an embodiment of the invention, the lengths of the strip shading patterns are substantially equal.
According to an embodiment of the invention, the strip shading patterns extend in directions which are substantially perpendicular to each other.
According to an embodiment of the invention, the pattern has two hollowed strip patterns which cross each other, and the hollowed strip patterns correspond to the strip images which cross each other.
According to an embodiment of the invention, the lengths of the hollowed strip patterns are substantially equal.
According to an embodiment of the invention, the hollowed strip patterns extend in directions which are substantially perpendicular to each other.
According to an embodiment of the invention, the projection light source includes a light emitting diode.
In addition, the invention further provides a method of calculating a distance which is suitable for calculating the shortest distance between an interactive device and an optical sensor array. The interactive device includes a projection light source and a shadow mask. The shadow mask has a pattern for defining an image projected by the interactive device, and the image is captured by the optical sensor array, wherein the pattern includes two strip patterns which cross each other, and the image includes two strip images which cross each other. The method of calculating the distance includes calculating the shortest distance between the interactive device and the optical sensor array according to the length of one of the strip patterns, the length of one of the strip images, the divergent angle of the projection light source, and the tilt angle of the projection light source.
According to an embodiment of the invention, the length of each of the strip patterns is x, wherein the length of the longer one of the strip images is WL, the divergent angle of the projection light source is ψ, the tilt angle of the projection light source is θ, the shortest distance between the interactive device and the optical sensor array is h, and x, WL, ψ, θ, and h comply with the following equation:
According to an embodiment of the invention, 0° ≤ θ ≤ 40°.
According to an embodiment of the invention, the length of each of the strip patterns is x, wherein the length of the shorter one of the strip images is WS, the divergent angle of the projection light source is ψ, the tilt angle of the projection light source is θ, the shortest distance between the interactive device and the optical sensor array is h, and x, WS, ψ, θ, and h comply with the following equation:
WS = 2 × (h secθ × cotψ + x)
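For illustration only, the following is a minimal numeric sketch (not part of the original disclosure) of how the WS relation above could be rearranged to recover the shortest distance h once WS has been measured. It assumes the equation exactly as printed; the function name and the sample values of x, ψ, and θ are assumptions.

```python
import math

def shortest_distance_from_ws(ws, x, psi_deg, theta_deg):
    """Rearrange WS = 2 * (h * sec(theta) * cot(psi) + x) to solve for h.

    ws        -- measured length of the shorter strip image
    x         -- length of each strip pattern on the shadow mask
    psi_deg   -- divergent angle of the projection light source, in degrees
    theta_deg -- tilt angle of the projection light source, in degrees
    """
    psi = math.radians(psi_deg)
    theta = math.radians(theta_deg)
    # h * sec(theta) * cot(psi) = WS/2 - x  =>  h = (WS/2 - x) * tan(psi) * cos(theta)
    return (ws / 2.0 - x) * math.tan(psi) * math.cos(theta)

# Example with assumed values: x = 5 mm, psi = 30 degrees, theta = 20 degrees, WS = 80 mm
print(shortest_distance_from_ws(80.0, 5.0, 30.0, 20.0))
```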
In summary, in embodiments of the invention, the optical sensor array is adopted and the strip patterns are designed on the interactive device, so that the relative positions of the interactive device and the three-dimensional display panel can be calculated from the changes in the lengths of the strip images corresponding to the tilt angle θ and the shortest distance h. False actions caused by the user clicking the three-dimensional display panel at a tilt angle are thereby prevented, which enhances the interactive sensitivity of the interactive three-dimensional display system.
In order to make the aforementioned and other objects, features and advantages of the invention comprehensible, embodiments accompanied with figures are described in detail below.
The accompanying drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
First Embodiment
Still referring to
In further detail, the strip patterns 1242a and 1242b are two strip shading patterns which cross each other, and the strip shading patterns 1242a and 1242b correspond to the strip images 1232a and 1232b which cross each other. The crossing shading patterns 1242a and 1242b form a dark area. In addition, the length x of the strip shading pattern 1242a and the length x of the strip shading pattern 1242b are substantially equal, and the two strip shading patterns extend in directions which are substantially perpendicular to each other. It should be noted that in other embodiments, the strip shading patterns 1242a and 1242b do not necessarily extend in directions perpendicular to each other.
On the other hand, since the optical sensor array 1110 captures the above image 1232, the image recognizing unit 1300 is able to calculate the relative positions of the interactive device 1200 and the three-dimensional display panel 1100 according to the image 1232 captured by the optical sensor array 1110. In the present embodiment, the image recognizing unit 1300 is capable of, for example, calculating the shortest distance between the interactive device 1200 and the optical sensor array 1110. The method of calculating the relative positions of the interactive device 1200 and the three-dimensional display panel 1100 by the image recognizing unit 1300 includes calculating the relative positions according to the length x of the strip pattern 1242a or 1242b, the length WS or WL of the strip image 1232a or 1232b, the divergent angle ψ of the projection light source 1220, and the tilt angle θ of the projection light source 1220. The relevant calculation methods are further described below.
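The embodiment does not spell out how the lengths WS and WL are extracted from the frame captured by the optical sensor array 1110, so the following is only a simplified, hypothetical sketch. It assumes the dark cross projected in this embodiment stays roughly aligned with the sensor rows and columns; the function name, the threshold, and the pixel pitch are all assumptions.

```python
import numpy as np

def strip_image_lengths(frame, dark_threshold=64, pixel_pitch_mm=0.2):
    """Estimate the lengths of the two crossing strip images (a dark cross)
    captured by the optical sensor array.

    Simplified, hypothetical measurement: it assumes the two strips are
    roughly aligned with the sensor rows and columns, so their lengths can be
    read off as the horizontal and vertical extents of the shadow region.

    frame          -- 2-D array of sensor intensity values
    dark_threshold -- intensity below which a pixel is treated as shadow (assumed)
    pixel_pitch_mm -- physical spacing of the optical sensors (assumed)
    """
    shadow = frame < dark_threshold
    rows, cols = np.nonzero(shadow)
    if rows.size == 0:
        return None  # no shadow cross found in this frame
    horizontal = (cols.max() - cols.min() + 1) * pixel_pitch_mm
    vertical = (rows.max() - rows.min() + 1) * pixel_pitch_mm
    # By convention, return the shorter extent first (WS) and the longer second (WL).
    return min(horizontal, vertical), max(horizontal, vertical)
```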
As shown in
For convenience of further description, the length of the longer strip image 1232b is defined as WL, and the length of the shorter strip image 1232a is defined as WS. In addition, the shortest distance between the interactive device 1200 and the optical sensor array 1110 is defined as h. In the present embodiment, 0° ≤ θ ≤ 40°, and the set of x, WL, ψ, θ, and h and the set of x, WS, ψ, θ, and h respectively comply with the following equations.
As known from the above equations, the lengths WS and WL of the strip images 1232a and 1232b change with the tilt angle θ and the distance h in a linear fashion. Hence, the image recognizing unit 1300 is able to calculate the relative positions of the interactive device 1200 and the three-dimensional display panel 1100 from the measured lengths of the strip images 1232a and 1232b.
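Putting the two sketches above together, a hypothetical usage might look as follows. Because the WL relation is not reproduced in this text, the tilt angle θ is treated here as known rather than solved for jointly with h; all names and values are assumptions.

```python
# Hypothetical usage of the sketches above (all values are assumptions).
x_mm, psi_deg, theta_deg = 5.0, 30.0, 20.0   # mask strip length, divergent angle, tilt angle
frame = read_sensor_frame()                   # hypothetical helper: one frame from the optical sensor array
lengths = strip_image_lengths(frame)
if lengths is not None:
    ws_mm, wl_mm = lengths
    h_mm = shortest_distance_from_ws(ws_mm, x_mm, psi_deg, theta_deg)
```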
Second Embodiment
In addition, the length of the hollowed strip pattern 2242a and the length of the hollowed strip pattern 2242b are substantially equal, and the two hollowed strip patterns extend in directions which are substantially perpendicular to each other. The above pattern 2242 defines an image 2232 projected by the interactive device, wherein the strip images 2232a and 2232b are a bright area. It should be noted that in other embodiments, the hollowed strip patterns 2242a and 2242b do not necessarily extend in directions perpendicular to each other.
Please refer to both
In summary, in embodiments of the invention, the optical sensor array is adopted and the strip patterns are designed on the interactive device, so that the relative positions of the interactive device and the three-dimensional display panel can be calculated from the changes in the lengths of the strip images corresponding to the tilt angle θ and the shortest distance h. False actions caused by the user clicking the three-dimensional display panel at a tilt angle are thereby prevented, which enhances the interactive sensitivity of the interactive three-dimensional display system.
It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the invention without departing from the scope or spirit of the invention. In view of the foregoing, it is intended that the invention cover modifications and variations of this invention provided they fall within the scope of the following claims and their equivalents.
Wang, Guo-Zhen, Pai, Cheng-Chiu, Huang, Yi-Pai, Wu, Pi-Cheng