A method and system for displaying information on a transparent display device includes receiving a touch input for selecting an object displayed on a screen of an external device that is viewable through a screen of the transparent display device, requesting the external device for information related to the object, receiving the information related to the object from the external device, and displaying the received information on the screen of the transparent display device.
11. A transparent display device comprising:
a transparent display configured to receive a touch input indicating first position information of an external device that is viewable through the transparent display, and to receive a second touch input indicating second position information of an object displayed on a screen of the external device viewable through the transparent display;
a communication unit configured to communicate with the external device; and
a processor configured to determine first coordinates of the external device on the transparent display device as seen through the transparent display device based on the first touch input, determine second coordinates of the object displayed on the screen of the external device viewable through the screen of the transparent display device based on the second touch input, determine relational information between the first coordinates and the second coordinates, request the external device for information related to the object based on the first coordinates, the second coordinates, and the relational information, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and
display the received information on the transparent display.
1. A method of displaying information on a transparent display device, the method comprising:
receiving a first touch input on a screen of the transparent display device indicating first position information of an external device that is viewable through the screen of the transparent display device;
determining first coordinates of the external device on the transparent display device as seen through the transparent display device based on the first touch input;
receiving a second touch input on the screen of the transparent display device indicating second position information of an object displayed on a screen of the external device viewable through the screen of the transparent display device;
determining second coordinates of the object displayed on the screen of the external device viewable through the screen of the transparent display device based on the second touch input;
determining relationship information between the first coordinates and the second coordinates;
requesting the external device for information related to the object based on the first coordinates, the second coordinates, and the relationship information;
receiving information related to the object from the external device in response to the requesting; and
displaying the received information on the screen of the transparent display device.
20. A non-transitory computer-readable recording medium having embodied thereon a program which, when executed, causes a transparent display device to perform a method of displaying information, the method comprising:
receiving a first touch input on a screen of the transparent display device indicating first position information of an external device that is viewable through the screen of the transparent display device;
determining first coordinates of the external device on the transparent display device as seen through the transparent display device based on the first touch input;
receiving a second touch input on the screen of the transparent display device indicating second position information of an object displayed on a screen of the external device viewable through the screen of the transparent display device;
determining second coordinates of the object displayed on the screen of the external device viewable through the screen of the transparent display device based on the second touch input;
determining relationship information between the first coordinates and the second coordinates;
requesting the external device for information related to the object based on the first coordinates, the second coordinates, and the relationship information;
receiving information related to the object from the external device in response to the requesting; and
displaying the received information on the screen of the transparent display device.
2. The method of
3. The method of
4. The method of
5. The method of
wherein the screen information is included in a selectable screen information menu item about the external device, which is displayed on the screen of the transparent display device,
wherein the screen information comprises at least one of screen size information and screen type information.
6. The method of
7. The method of
8. The method of
9. The method of
a display location on the screen of the external device of the at least one other object differs from that of the object.
10. The method of
12. The transparent display device of
13. The transparent display device of
14. The transparent display device of
15. The transparent display device of
wherein the screen information is included in a selectable screen information menu item about the external device, which is displayed on the transparent display,
wherein the screen information comprises at least one of screen size information and screen type information,
wherein the screen information menu item is stored in the storage unit,
wherein the screen information menu item is read by the processor from the storage unit and displayed on the transparent display.
16. The transparent display device of
17. The transparent display device of
18. The transparent display device of
wherein a display location on the screen of the external device of the at least one other object differs from that of the object.
19. The transparent display device of
This application is a continuation application of U.S. application Ser. No. 14/031,483, filed Sep. 19, 2013, in the U.S. Patent and Trademark Office, which claims priority from Korean Patent Application No. 10-2012-00104156, filed on Sep. 19, 2012, and Korean Patent Application No. 10-2013-00106227, filed on Sep. 4, 2013, in the Korean Intellectual Property Office, the disclosures of which are incorporated herein by reference in their entireties.
Methods and apparatuses consistent with exemplary embodiments relate to displaying information, and more particularly, to a system and method for displaying information related to an external object or an external device on a transparent display device.
Transparent display devices are considered next generation display devices. A transparent display device has a degree of transparency that enables a user to see an external object or an external device through the transparent display device.
However, a transparent display device does not display information related to the external object or the external device.
Exemplary embodiments provide a system, a method, and an apparatus for displaying information related to an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an object displayed on a screen of an external device seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
Exemplary embodiments also provide a system, a method, and an apparatus for displaying information related to an external object seen through a screen of a transparent display device on the screen of the transparent display device, and a recording medium thereof.
According to an aspect of an exemplary embodiment, there is provided a method of displaying information on a transparent display device, the method including: receiving a touch input on the transparent display device that selects an object displayed on an external device that is viewable through a screen of the transparent display device; requesting the external device for information related to the object; receiving the information related to the object from the external device; and displaying the received information on the screen of the transparent display device.
The touch input may indicate a contour line of the object that is viewable through the screen, may be a tap-based touch indicating a location on the screen at which the object is viewable through the screen, or may indicate a closed region on the screen at which the object is viewable through the screen.
The information related to the object may indicate at least one other object having a type that is the same as a type of the object, and a display location on a screen of the external device of the at least one other object may differ from that of the object.
The information related to the object may indicate information that is not displayed on a screen of the external device.
The displaying may include displaying the received information at a display location on the screen of the transparent display device that corresponds to a display location of the object on a screen of the external device.
The method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
The method may further include displaying information related to the external device based on an augmented reality service on the screen of the transparent display device.
The requesting and the receiving of the information may be performed based on one of direct communication between devices, communication via a server, and communication via a repeater.
According to another aspect of an exemplary embodiment, there is provided a transparent display device including: a transparent display configured to receive a touch input that selects an object displayed on an external device that is viewable through the transparent display; a communication unit configured to communicate with an external device that is viewable through the transparent display; and a processor configured to request the external device for information related to the object based on the touch input, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and control the transparent display to display the received information.
According to another aspect of an exemplary embodiment, there is provided a method of displaying information on a transparent display device, the method including: receiving a first touch input on a screen of the transparent display device indicating first position information of an external device that is viewable through the screen of the transparent display device and receiving a second touch input on the screen of the transparent display device indicating second position information of an object displayed on a screen of the external device viewable through the screen of the transparent display device; requesting the external device for information related to the object based on the first position information and the second position information; receiving information related to the object from the external device in response to the requesting; and displaying the received information on the screen of the transparent display device.
The first position information may indicate a contour line of the external device viewable through the screen of the transparent display device.
The first touch input may be independent touch operations on a first point and a second point on the screen of the transparent display device that indicate a contour line of the external device that is viewable through the screen of the transparent display device.
The first touch input may be a touch-and-drag operation for connecting a first point and a second point on the screen of the transparent display device that indicates a contour line of the external device that is viewable through the screen of the transparent display device.
The first touch input may indicate a touch-based region adjusting operation for guide information displayed on the screen of the transparent display device, and a range related to the touch-based region adjusting operation for the guide information may be based on a contour line of the external device that is viewable through the screen of the transparent display device.
The first touch input may be a touch operation for selecting screen information of the external device, wherein the screen information may be included in a selectable screen information menu item about the external device, which is displayed on the screen of the transparent display device, and the screen information may include at least one of screen size information and screen type information.
The second position information may indicate a contour line of the object that is viewable through the screen of the transparent display device.
The second touch input may be a tap-based touch indicating a location on the screen of the transparent display device at which the object is viewable through the screen of the transparent display device.
The second touch input may indicate a closed region on the screen of the transparent display device at which the object is viewable through the screen of the transparent display device.
The information related to the object may indicate at least one other object having a type that is the same as a type of the object, and a display location on the screen of the external device of the at least one other object may differ from that of the object.
The method may further include editing the information that is displayed on the screen of the transparent display device based on an interaction between the transparent display device and the external device.
According to another aspect of an exemplary embodiment, there is provided a transparent display device including: a transparent display configured to receive a touch input indicating first position information of an external device that is viewable through the transparent display, and to receive a second touch input indicating second position information of an object displayed on a screen of the external device viewable through the transparent display; a communication unit configured to communicate with the external device; and a processor configured to request the external device for information related to the object based on the first position information and the second position information, via the communication unit, receive information related to the object from the external device in response to the request, via the communication unit, and display the received information on the transparent display.
According to another aspect of an exemplary embodiment, there is provided a method of displaying information on a screen of a transparent display device, the method including: receiving from the transparent display device a request for information related to at least one object displayed on the screen of an external device that is viewable through a screen of the transparent display device; selecting the at least one object in response to the request; and transmitting the information related to the selected object to the transparent display device, wherein the request for information related to the object comprises first position information of the external device indicated by a first touch input on the transparent display device and second position information of the object displayed on the screen of the external device indicated by a second touch input on the transparent display device.
According to another aspect of an exemplary embodiment, there is provided a non-transitory computer-readable recording medium having embodied thereon a program for implementing the above-described methods of displaying information on the transparent display device.
The above and other aspects will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings in which:
As the exemplary embodiments allow for various changes and numerous embodiments, particular exemplary embodiments will be illustrated in the drawings and described in detail in the written description. However, this is not intended to limit the exemplary embodiments to particular modes of practice, and it is to be appreciated that all changes, equivalents, and substitutes that do not depart from the spirit and technical scope of the disclosure are encompassed. In the description, certain explanations of well-known related art are omitted.
While such terms as “first,” “second,” etc., may be used to describe various components, such components must not be limited to the above terms. The above terms are used only to distinguish one component from another.
The terms used in the present specification are merely used to describe particular exemplary embodiments, and are not intended as limiting. All terms including descriptive or technical terms used herein should be construed as having meanings that would be understood to one of ordinary skill in the art. However, the terms may have different meanings according to an intention of one of ordinary skill in the art, precedent cases, or the appearance of new technologies. Also, some terms may be arbitrarily selected by the applicant, and in this case, the meaning of the selected terms will be described in detail in the detailed description. Thus, the terms used herein have to be defined based on the meaning of the terms together with the description throughout the specification. Screens suggested in the present application are used only for descriptive purposes, and are not intended as limiting.
An expression used in the singular encompasses the expression of the plural, unless it has a clearly different meaning in the context. In the present specification, it is to be understood that the terms such as “including” or “having,” etc., are intended to indicate the existence of the features, numbers, steps, actions, components, parts, or combinations thereof disclosed in the specification, and are not intended to preclude the possibility that one or more other features, numbers, steps, actions, components, parts, or combinations thereof may exist or may be added.
As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. Expressions such as “at least one of,” when preceding a list of elements, modify the entire list of elements and do not modify the individual elements of the list.
An object denotes a component or information displayed on an external device or a screen of the external device of a transparent display device. For example, an object may include an image, an image included in another image, an icon, a folder icon, an icon included in a folder icon, text, a pop-up window, an application execution window, content included in an application execution window, a list, an item, content, and a file included in a list; however, the present invention is not limited thereto. Examples of an object will be described in detail in various examples of screens that will be described later. The object may be referred to as an external object of the transparent display device.
Throughout the entire specification, a touch input denotes information input by a user through a touch-based gesture using a finger of the user or a touch tool. The touch tool may be referred to as an external input device, a stylus, or a stylus pen.
The touch-based gesture may be variously defined. In other words, examples of the touch-based gesture may include touch-based motions on a touch screen, such as tap, touch-and-hold, double tap, drag, touch-and-drag, panning, flick, drag-and-drop, sweep, and swipe, but the touch-based gesture is not limited thereto.
The touch input may be replaced by a gesture based on an image captured by a camera, depending on the input that the touch is intended to represent. For example, if the touch input is an input for selecting an object displayed on an external device, the touch input may be replaced by a gesture or operation according to a moving direction or sign of a hand captured by the camera. The camera may be configured based on an image sensor or an optical sensor.
The touch input may be replaced by a user voice signal based on natural language, depending on the input that the touch is intended to represent. For example, if a touch input is an input for selecting an object including a certain letter or a name displayed on an external device, the touch input may be replaced by a user voice signal based on natural language representing the certain letter or the name of the object.
Hereinafter, exemplary embodiments will be described in detail with reference to the accompanying drawings, wherein like reference numerals denote like elements, and repeated descriptions thereof are omitted.
Referring to
For example, as shown in
The information display system may further include an access point 130, as shown in
When the information display system is configured as shown in
That is, the transparent display device 100 and the external device 110 may be connected to each other via a wire. For example, the transparent display device 100 and the external device 110 may be connected to each other via a universal serial bus (USB) or a universal asynchronous receiver/transmitter (UART) to transmit/receive data. The device-to-device direct communication method may be referred to as a machine-to-machine (M2M) communication method, a device-to-device (D2D) communication method, or a peer-to-peer (P2P) communication method.
Therefore, the communication between the transparent display device 100 and the external device 110 may be performed based on one of the direct communication between devices, the communication method via the access point 130, and the communication method via the server 120, according to elements of the information display system, but is not limited thereto.
The transparent display device 100 and the external device 110 may transmit and/or receive at least one of size information thereof, owner information thereof, and information sharable with other devices, through a short distance communication method such as near field communication (NFC).
The size information of the device may be represented as, for example, (width×length×thickness) mm, but is not limited thereto. Screen information may include screen size information and screen type information, but is not limited thereto. The screen size information may be represented as, for example, A4, B5, 7 inches, or 5.5 inches, but is not limited thereto. The screen type information may represent whether the screen is a touch screen or a non-touch screen, but is not limited thereto. For example, the screen type information may represent whether the screen is a liquid crystal display (LCD) panel or an active-matrix organic light-emitting diode (AMOLED) panel.
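As a minimal sketch, the device and screen information described above might be modeled as the following record; the field names and values are illustrative assumptions, not part of the claimed embodiment.

```python
# Illustrative sketch: a record for the device/screen information that may be
# exchanged over a short distance link such as NFC. All field names are assumed.
from dataclasses import dataclass

@dataclass
class ScreenInfo:
    size: str          # e.g. "A4", "B5", "7 inches", "5.5 inches"
    touch_type: str    # "touch" or "non-touch"
    panel: str         # e.g. "LCD" or "AMOLED"

@dataclass
class DeviceInfo:
    dimensions_mm: tuple  # (width, length, thickness) in mm
    owner: str
    screen: ScreenInfo

# Example values (hypothetical)
info = DeviceInfo(
    dimensions_mm=(71, 142, 8),
    owner="user",
    screen=ScreenInfo(size="5.5 inches", touch_type="touch", panel="AMOLED"),
)
```

Such a record could be serialized and sent in a single NFC message, with each receiving device free to ignore fields it does not use.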
The transparent display device 100 may display the information about the external device 110, which is transmitted from the external device 110 via a short distance communication method, such as NFC, as information about the external device 110 based on the augmented reality service. For example, the transparent display device 100 may display the information about the external device 110 on a display area adjacent to the external device 110 that is seen through the transparent display device 100. The display area is a part of a screen of the transparent display device 100. The external device 110 that is seen through the transparent display device 100 may be referred to as the external device 110 that is seen via the screen of the transparent display device 100.
The transparent display device 100 is a device having a transparent display. For example, the transparent display device 100 may be a mobile phone having a transparent display, a smartphone having a transparent display, a notebook computer having a transparent display, a tablet PC having a transparent display, a handheld PC having a transparent display, an electronic book terminal having a transparent display, a digital broadcasting terminal having a transparent display, a personal digital assistant (PDA) having a transparent display, a portable multimedia player (PMP) having a transparent display, a navigation device having a transparent display, a smart TV having a transparent display, a consumer electronics (CE) device having a transparent display (for example, a refrigerator having a transparent display, an air conditioner having a transparent display, a dish washing machine having a transparent display, etc.), or an iOS-convertible device having a transparent display, but is not limited thereto. The transparent display may be applied to various fields such as high added-value glass, glass as a functional car element, car dashboards, navigation devices, security electronic devices, solar batteries, electronic devices for military use, game consoles, toys, and show windows, as well as smart windows. The screen of the transparent display device 100 may be referred to as a screen on the transparent display.
The transparent display device 100 may provide an application execution function, a communication function, a media player function, a web-browsing function, a word-processing function, an e-mail transmission function, a messenger function, and/or a data storage function, but is not limited thereto.
The transparent display device 100 requests information related to at least one object that is displayed on the external device 110 and seen through the transparent display device 100, based on a touch input. When receiving the information related to the object from the external device 110, the transparent display device 100 displays the received information.
The external device 110 is a device that is seen through the transparent display device 100, through the screen of the transparent display device 100, or through the transparent display of the transparent display device 100. The external device 110 may be referred to as another device. The external device 110 may not include a transparent display. For example, the external device 110 may be a mobile phone, a smartphone, a notebook computer, a tablet PC, a handheld PC, an electronic book terminal, a digital broadcasting terminal, a PDA, a PMP, a navigation device, a smart TV, a CE device (for example, a refrigerator, an air conditioner, a dishwashing machine having a display panel, etc.), or an iOS convertible device, but is not limited thereto. Alternatively, the external device 110 may include a transparent display.
The external device 110 may provide an application execution function, a communication function, a media player function, a web-browsing function, a word-processing function, an e-mail transmission function, a messenger function, and/or a data storage function, but is not limited thereto.
When the transparent display device 100 requests the information related to at least one object that is displayed, the external device 110 selects the requested object and transmits information related to the requested object to the transparent display device 100.
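The request-and-respond exchange described above can be sketched as follows. The message format, field names, and the rectangle-based object selection are illustrative assumptions; the embodiment only requires that the external device select the requested object and return related information.

```python
# Hypothetical sketch of the object-information exchange between the transparent
# display device (requester) and the external device (responder).

def build_request(first_coords, second_coords, relational_info):
    """Request sent by the transparent display device (illustrative format)."""
    return {
        "type": "object_info_request",
        "device_coords": first_coords,    # first touch: external device position
        "object_coords": second_coords,   # second touch: object position
        "relation": relational_info,      # e.g. difference between coordinates
    }

def handle_request(request, displayed_objects):
    """External device: select the object at the requested position and reply."""
    x, y = request["object_coords"]
    for obj in displayed_objects:
        ox, oy, w, h = obj["bounds"]  # object's display rectangle on its screen
        if ox <= x < ox + w and oy <= y < oy + h:
            return {"type": "object_info_response", "info": obj["info"]}
    return {"type": "object_info_response", "info": None}

# Example exchange with one hypothetical object on the external device's screen
objects = [{"bounds": (10, 10, 100, 50), "info": "photo.jpg metadata"}]
req = build_request((0, 0), (40, 30), {"dx": 40, "dy": 30})
resp = handle_request(req, objects)
```

The same message pair could travel over any of the channels discussed above: direct device-to-device communication, a server, or an access point acting as a repeater.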
In operation S201, the transparent display device 100 receives a first touch input and a second touch input. The first touch input represents reference information of the external device 110 that is seen through the transparent display device 100. The reference information is used to detect a display location of the object on the external device 110, wherein the object is selected by the second touch input in the transparent display device 100. The reference information may be referred to as first position information of the external device 110.
In
In
Referring to
Referring to
Referring to
The transparent display device 100 may display an arrow or a block setting shown in
In
The guide information may be, for example, camera focusing range information. The guide information may be displayed according to a request of a user of the transparent display device 100. For example, the request of the user may include a request for displaying guide information for executing the information display method according to the exemplary embodiment, or a request for executing the information display method.
As shown in
Accordingly, the transparent display device 100 may trace the contour line of the external device 110 according to adjusted (x, y) coordinate values of the four points P5, P6, P7, and P8 of the guide information G1. The tracing of the contour line may be performed by connecting the changed (x, y) coordinate values of the points P5, P6, P7, and P8, but is not limited thereto. The changed (x, y) coordinate value of each point may be obtained by adding a variation amount according to the dragging operation to the original (x, y) coordinate value, but is not limited thereto. That is, according to two-touch operations, that is, touching the four points P5, P6, P7, and P8 of the guide information G1 and touching the points representing the desired region, the original (x, y) coordinate values of the points may be updated to the (x, y) coordinate values of the second touched points.
Next, when the right lowermost point P8 of the guide information G1 is touched and then dragged to the right lowermost point on the contour line of the external device 110, the region of the guide information G1 is adjusted based on the points P6, P7, and P8, except for the left uppermost point P5, and accordingly, the display state of the guide information G1 is changed as shown in
According to the examples shown in
The one point and the diagonal point in the guide information G1 are not limited to the examples shown in
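The corner-coordinate updates described above (adding the drag variation to a corner's original (x, y) value, or replacing it with a second touched point, then connecting the four points to trace the contour line) can be sketched as follows; the corner labels and coordinate values are illustrative assumptions.

```python
# Sketch of adjusting the guide information G1. Each corner's (x, y) value is
# either offset by the amount of a drag operation or replaced by a second
# touched point. The specific coordinates below are hypothetical.

def drag_corner(corner, delta):
    """New corner = original (x, y) + variation amount according to the drag."""
    return (corner[0] + delta[0], corner[1] + delta[1])

def retouch_corner(corner, touched_point):
    """Two-touch update: the corner is updated to the second touched point."""
    return touched_point

# Corners of G1: left-upper P5, right-upper P6, left-lower P7, right-lower P8
g1 = {"P5": (20, 20), "P6": (120, 20), "P7": (20, 220), "P8": (120, 220)}

# Drag P8 toward the external device's right lowermost contour point
g1["P8"] = drag_corner(g1["P8"], (35, 15))

# Trace the contour line by connecting the four (possibly adjusted) points
contour = [g1["P5"], g1["P6"], g1["P8"], g1["P7"]]
```

A production implementation would redraw the guide region after each adjustment; here only the coordinate bookkeeping is shown.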
As shown in
Meanwhile, when the external device 110 is seen through the transparent display device 100 as shown in
Also, the first touch input may be based on a touch operation for selecting screen information of the external device 110, which is included in a menu 910 shown in
The screen size information may represent, for example, whether the screen size of the transparent display device 100 is equal to a screen size of the external device, or certain size information such as A4, B5, 7 inches, 4 inches, etc. as shown in
If the screen size of the transparent display device 100 is different from the screen size of the external device 110, the transparent display device 100 may change the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input into information according to the screen size of the external device 110.
For example, when the transparent display device 100 has a screen size (length, width, area, etc.) of 4 inches and the external device 110 has a screen of 7 inches, the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device into coordinate information on the screen size of 7 inches, by using a function of converting the coordinate information of the screen size of 4 inches into coordinate information of the screen size of 7 inches. Here, the transparent display device 100 may use relational information between the (x, y) coordinate information on the transparent display device 100 according to the first touch input and the (x, y) coordinate information on the transparent display device 100 according to the second touch input (for example, difference information between the coordinate information).
Also, if the transparent display device 100 has a screen size of 10 inches and the external device 110 has a screen size of 4 inches, the transparent display device 100 may change the coordinate information of the first touch input on the transparent display device 100 and the coordinate information of the second touch input on the transparent display device 100 into coordinate information on the screen size of 4 inches, by using a function of converting the coordinate information of the 10-inch screen size into the coordinate information of 4-inch screen size.
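The screen-size conversion described above can be sketched as follows. This is a minimal illustration that assumes a single linear scale factor between the two screen sizes; the function name and the simple proportional model are illustrative assumptions rather than part of the described system.

```python
# Minimal sketch of the screen-size coordinate conversion described
# above. A single linear scale factor between the two screen sizes is
# assumed; the function name is a hypothetical illustration.

def convert_coordinates(points, src_size, dst_size):
    """Scale (x, y) coordinates from one screen size to another."""
    ratio = dst_size / src_size
    return [(x * ratio, y * ratio) for (x, y) in points]

# Example: coordinates captured on a 4-inch screen mapped onto a
# 7-inch screen, as in the 4-inch to 7-inch case described above.
touch_points = [(100.0, 200.0), (120.0, 240.0)]
converted = convert_coordinates(touch_points, 4.0, 7.0)
# → [(175.0, 350.0), (210.0, 420.0)]
```

The same function applied with the sizes swapped covers the 10-inch to 4-inch case.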
The above described function of converting the coordinate information according to the screen size may be included in the external device 110. When the external device 110 has the function of converting the coordinate information, the transparent display device 100 may transmit the (x, y) coordinate information on the transparent display device 100 according to the first touch input, the (x, y) coordinate information on the transparent display device 100 according to the second touch input, and the screen size information of the transparent display device 100 to the external device 110.
The screen type information may include information representing whether the screen type of the external device 110 is a touch type or a non-touch type. If the screen of the external device 110 is the touch type screen, the external device 110 may recognize a region in which the transparent display device 100 and the external device 110 overlap each other. Accordingly, the first touch input may not include information relating to the contour line of the external device 110, but may only include the information representing that the screen of the external device 110 is the touch type screen.
In addition, in operation S201 of
The second touch input may be based on at least one of a touch operation of touching an arbitrary point on a contour line of an object that is seen through the transparent display device 100 and dragging the touched location along the contour line of the object, and a touch operation of writing along the object (for example, text) that is seen through the transparent display device 100. The second touch input may be referred to as a touch input on the screen of the transparent display device 100 indicating position information (or second position information) of the object displayed on the screen of the external device 110. The second touch input may be referred to as an input that selects a position of the screen of the external device 110 viewable through the transparent display device 100. The object is displayed on the screen of the external device 110 at the position. The position comprises one of a coordinate position of the screen of the external device 110 viewable through the transparent display device 100 and an area of the screen of the external device 110 viewable through the transparent display device 100.
That is, referring to
Also, the touch operation between the start point and the end point of the touch operation for drawing along the contour line of the object may be performed continuously or discontinuously. If the touch operation is performed discontinuously, the end point of the touch operation for drawing along the contour line of the object may be changed.
For example, the touch operation for drawing along the contour line of the object in
That is, when the screen displayed on the external device 110 is shown as
The transparent display device 100 detects relational information between (x, y) coordinate information on the transparent display device 100 according to the first touch input and (x, y) coordinate information on the transparent display device 100 according to the second touch input in
For example, when the coordinate information according to the first touch input (801) includes coordinates from (x(1), y(1)) to (x(1+m), y(1+m)), the coordinate information according to the second touch input (802) includes coordinates from (x(i), y(i)) to (x(i+j), y(i+j)), and the number of pieces of the coordinate information obtained by the first touch input and the number of pieces of the coordinate information obtained by the second touch input are equal to each other, the transparent display device 100 may obtain relational information from coordinates (x(1)−x(i), y(1)−y(i)) to (x(1+m)−x(i+j), y(1+m)−y(i+j)). Here, m, i, and j are natural numbers that are equal to or greater than 2.
However, the number of pieces of coordinate information on the transparent display device 100 according to the first touch input and the number of pieces of coordinate information on the transparent display device 100 according to the second touch input may be different from each other. In this case, the transparent display device 100 may detect the above relational information by sampling the coordinate information obtained by the first touch input and the coordinate information obtained by the second touch input. A target to be sampled may be determined according to the display location thereof.
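The relational-information detection described above, pairwise coordinate differences with sampling when the two touch inputs yield different numbers of coordinates, might be sketched as follows. The evenly spaced sampling rule used here is an assumption, since the description leaves the sampling criterion open.

```python
# Illustrative sketch of detecting relational information as pairwise
# differences between first-touch and second-touch coordinates. The
# evenly spaced index sampling is an assumption; the description does
# not fix a particular sampling rule.

def sample(points, n):
    """Pick n points at evenly spaced indices along a touch trace."""
    if len(points) == n:
        return list(points)
    if n == 1:
        return [points[0]]
    step = (len(points) - 1) / (n - 1)
    return [points[round(i * step)] for i in range(n)]

def relational_info(first, second):
    """Pairwise differences (x1 - x2, y1 - y2) between two traces."""
    n = min(len(first), len(second))
    f, s = sample(first, n), sample(second, n)
    return [(fx - sx, fy - sy) for (fx, fy), (sx, sy) in zip(f, s)]

# The first touch traces the contour of the external device; the
# second touch traces the selected object. The counts differ, so the
# longer trace is sampled down before the differences are taken.
first_touch = [(10, 10), (20, 20), (30, 30), (40, 40)]
second_touch = [(5, 5), (15, 15)]
diffs = relational_info(first_touch, second_touch)  # → [(5, 5), (25, 25)]
```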
Next, as shown in
Accordingly, the image of the object displayed on the transparent display device 100 may include the information of sensing the second touch input (802) in
As shown in
The transparent display device 100 detects and stores the coordinate information on the transparent display device 100 according to the first touch input and the second touch input in the process shown in
As shown in
The transparent display device 100 obtains coordinate information on the transparent display device 100 according to the first touch input (807) and the second touch input (808) in
When the transparent display device 100 and the external device 110 overlap each other as shown in
According to displaying information of sensing the second touch inputs in the processes shown in
Also, the processes shown in
Referring to
The information about the external device 110 based on the augmented reality service may be provided using an access point. When the transparent display device 100 and the external device 110 are located within a communication area of the same access point, physical locations of the transparent display device 100 and the external device 110 may be estimated by using an indoor sensor capable of estimating a physical location of a device, such as a geomagnetic sensor, an acceleration sensor, a gyro sensor, or an altitude sensor mounted in the device. Thus, the information about the external device 110 based on the augmented reality service may be provided from the above described other external device or the server according to the estimated physical locations.
Otherwise, the transparent display device 100 receives or reads, from the external device 110, information that is necessary for receiving information based on the augmented reality service about the external device 110 (for example, mark information for recognizing the external device 110) using short distance communication such as NFC, and then, collects and displays the information based on the augmented reality service about the external device 110 from the server or the above described other external device.
The information about the external device 110 seen through the transparent display device 100 based on the augmented reality service may include a name of the device, a name of the owner, and contents of the external device, which may be shared with other devices, as shown in
When displaying the information about the external device 110 based on the augmented reality service as shown in
According to receipt of the first touch input, information about a shared folder may be displayed on the transparent display device 100 as shown in
When the second touch input according to the operation of setting the touch-based closed region or tap-based touch operation on a desired folder is received based on the information about the shared folder displayed on the transparent display device 100 as shown in
The transparent display device 100 may perform the second touch input operation by an operation of setting a touch-based closed region on a desired picture from among the available pictures shown in
Meanwhile, in operation S202 of
Otherwise, the signal requesting the information related to the object may include coordinate information on the external device 110 according to the first and second touch inputs, wherein the coordinate information is converted by using the coordinate information converting function of the transparent display device 100, and/or relational information between the coordinate information. The coordinate information on the external device 110 according to the second touch input may be coordinate information of the object that is displayed on the external device 110.
The signal requesting the information related to the object may further include a signal requesting relation information with the object. The signal requesting relation information with the object may include, for example, information for requesting a folder and objects included in the folder, when the object selected according to the second touch input is the folder. The objects included in the folder may be referred to as objects that are not displayed on the external device 110.
The signal requesting the information related to the object may include coordinate information on the transparent display device 100 according to the first and second touch inputs, and screen size information of the transparent display device 100. In this case, the external device 110 may detect coordinate information on the external device 110 according to the first and second touch inputs based on the information transmitted from the transparent display device 100 and the screen information of the external device 110. The coordinate information on the external device 110 may be detected by the processes described with reference to
The signal requesting the information related to the object may include various pieces of information that may be estimated by the examples of the first and second touch inputs described with reference to
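The signal requesting the information related to the object might be assembled as in the following sketch. The message format and field names are illustrative assumptions; the description enumerates the kinds of information the signal may carry but does not define a concrete wire format.

```python
# Hypothetical sketch of assembling the information-request signal
# sent to the external device. All field names are illustrative
# assumptions, not a format defined by the described system.

def build_object_request(first_coords, second_coords, screen_size,
                         relational_info=None):
    """Bundle touch coordinates and screen size into a request message."""
    request = {
        "type": "object_info_request",
        # Coordinates of the first touch input (position of the
        # external device) on the transparent display device.
        "first_touch_coords": first_coords,
        # Coordinates of the second touch input (selected object).
        "second_touch_coords": second_coords,
        "screen_size_inches": screen_size,
    }
    if relational_info is not None:
        # Optional difference information between the two inputs.
        request["relational_info"] = relational_info
    return request

msg = build_object_request([(0, 0), (50, 80)], [(10, 20), (30, 40)], 4.0)
```

With the screen size included, the external device could perform the coordinate conversion on its side, as described above.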
In operation S203, the transparent display device 100 receives information related to the selected object from the external device 110, and in operation S204, the transparent display device 100 displays the received information related to the object on the transparent display device 100.
The information related to the object may include at least one other object having the same display type as that of the object selected by the second touch input. The other object has a different display location on the external device 110 from that of the selected object. That is, as shown in
Here, the display locations of the received information on the transparent display device 100 may similarly correspond to the display locations on the external device 110. If there are a plurality of pieces of received information, the transparent display device 100 receives information about display coordinates on the external device 110, detects information about display coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the display coordinate information transmitted from the external device 110, and displays the plurality of objects by using the detected coordinate information. The coordinate information may be detected by the coordinate information converting operation that is described above.
However, the external device 110 may detect information about coordinates on the transparent display device 100 by using the screen size information of the transparent display device 100 and the information about the display coordinates of the plurality of pieces of the object information on the external device 110, and may transmit the detected coordinate information and the object information to the transparent display device 100. Then, the transparent display device 100 may display the objects based on the received coordinate information.
In operation S203, the information about the selected object transmitted from the external device 110 may include information relating to the selected object. The information relating to the object may include information that is not displayed on the external device 110 (for example, information about objects included in a folder) as described above.
In operation S204, displaying the received information on the transparent display device 100 may include displaying the received information at similar locations as those of the external device 110 as shown in
In operation S1101, the transparent display device 100 receives a first touch input and a second touch input. The first and second touch inputs are the same as those described with reference to
In operation S1102, the transparent display device 100 requests the external device 110 for information related to an object based on the first and second touch inputs. The request for the information related to the object is the same as that described in operation S201 of
In operation S1103, the transparent display device 100 receives information corresponding to the request from the external device 110. The information related to the object that is received is the same as that described in operation S203.
In operation S1104, the transparent display device 100 displays the received information on the transparent display device 100.
In operation S1105, the transparent display device 100 edits the received information that is displayed on the transparent display device 100 according to a user input.
That is, when the received object is displayed on the transparent display device 100 as shown in
The user inputs 701 and 702 may be received via various touch-based operations, for example, a touch-based operation for long-touching the object to be moved and dragging the object to a desired location of the object to be combined, a touch-based operation for long-touching the object to be moved and long-touching the desired location of the object to be combined, a touch-based operation for setting a touch-based closed region on the object to be moved and long-touching the desired location of the object to be combined, or a touch-based operation for setting a touch-based closed region on the object to be moved, setting a touch-based closed region on the desired location of the object to be combined, and connecting the closed regions.
The editing operation in operation S1105 is not limited to the combination of the objects as shown in
The transparent display device 100 may perform the above editing operation based on an interaction with the external device 110. Accordingly, the information displayed on the external device 110 may reflect the editing result in the transparent display device 100 in real-time. The editing result may be stored in the external device 110 only, in the transparent display device 100 only, or in both the devices 100 and 110, according to the user input in the transparent display device 100.
In operation S1201, a touch input for selecting an object displayed on the external device 110 that is seen through the transparent display device 100 is received. Here, the received touch input may correspond to the second touch input described with reference to
In operation S1202, front and rear surface portions of the transparent display device 100 to which the touch input is received are deformed to protrude toward the external device 110.
Meanwhile, in operation S1203, the transparent display device 100 receives information related to the selected object from the external device 110 based on the touch input due to the contact between the rear surface portion 1302 of the transparent display device 100 and the external device 110.
In operation S1204, the transparent display device 100 displays the received information.
In operation S1401, the transparent display device 100 receives the information related to the object displayed on the external device 110 and the screen size information of the external device 110 via the local area wireless network. The local area wireless communication may include NFC, Bluetooth communication, Wi-Fi direct communication, and IrDA communication, but is not limited thereto.
In operation S1402, the transparent display device 100 checks whether the transparent display device 100 overlaps the external device 110. The checking in operation S1402 may include checking the intention of the user to display, on the transparent display device 100, the object displayed on the external device 110 that is seen through the transparent display device 100, according to the touch input to the transparent display device 100. The intention of the user may be interpreted as the intention to select an object to be displayed on the transparent display device 100.
The checking operation may be performed by disposing a contact sensor on the rear surface portion of the transparent display device 100 or transmitting a sensing result sensed by a contact sensor disposed on a front surface portion of the external device 110 to the transparent display device 100 via the local area wireless communication, but is not limited thereto.
Also, in the operation S1402, the transparent display device 100 and the external device 110 may overlap so that the external device 110 may be included within the screen of the transparent display device 100 when the external device 110 is smaller as shown in
In operation S1402, if it is determined that the transparent display device 100 and the external device 110 overlap each other, the transparent display device 100 displays the information related to the object displayed on the external device 110 by using the information transmitted via the local area wireless communication according to the user input in operation S1403. The user input in the operation S1403 may include a request for displaying the object displayed on the external device 110 that is seen through the transparent display device 100, but is not limited thereto.
In operation S1501, the transparent display device 100 photographs an object displayed on the external device 110 by using the camera function.
In operation S1502, the transparent display device 100 determines whether the transparent display device 100 and the external device 110 overlap each other. Determining whether the transparent display device 100 and the external device 110 overlap each other may be performed in the same manner as that of operation S1402 described above.
In operation S1503, if it is determined that the transparent display device 100 and the external device 110 overlap each other, the transparent display device 100 displays the object displayed on the external device 110 that is photographed according to the user input. The user input may include a request for outputting the object displayed on the photographed external device 110, but is not limited thereto.
Referring to
The transparent display 1610 is configured so that the object displayed on a screen of the external device 110 may be seen through the transparent display 1610, and may be configured to receive a touch-based input. The transparent display 1610 may be formed in various types, for example, a transparent liquid crystal display (LCD) type, a transparent thin-film electroluminescent panel (TFEL) type, a transparent OLED type, or a projection type. Examples of the structure of the transparent display 1610 are described below.
The transparent LCD type is a transparent display device formed by removing a backlight unit from a currently used LCD device and using a pair of polarization plates, an optical film, a transparent thin film transistor (TFT), and a transparent electrode. The transparent display device may be referred to as a transparent display. In the case of the transparent LCD device, transmittance is degraded due to the polarization plates or the optical film, and optical efficiency is reduced since peripheral light is used instead of the backlight unit; however, a large size transparent display may be realized.
The transparent TFEL type is a transparent display device using an alternating current (AC) type inorganic thin film EL display (AC-TFEL) including a transparent electrode, an inorganic phosphor, and an insulating film. The AC-TFEL emits light when accelerated electrons pass through the inorganic phosphor to excite the phosphor. If the transparent display 1610 is the transparent TFEL type, the processor 1640 may adjust the electrons to be projected to an appropriate location to determine a location displaying the information. Since the inorganic phosphor and the insulating film are transparent, the transparent display may be easily obtained.
Otherwise, the transparent OLED type is a transparent display device using an OLED that emits light by itself. Since an organic emission layer is transparent, the OLED may serve as the transparent display device provided that both electrodes are realized as transparent electrodes. In the OLED, electrons and holes are injected from both sides of the organic emission layer to be combined in the organic emission layer and emit light. The transparent OLED device may display the information by injecting the electrons and holes to desired locations.
Referring to
The transparent substrate 1702 may be formed of a polymer material that is transparent such as plastic, or a glass material. The material forming the transparent substrate 1702 may be determined according to the environment in which the transparent display device 100 is used. For example, the polymer material is light and flexible, and thus may be applied to a portable display device. The glass material may be applied to show windows or general windows.
The transparent transistor layer 1703 is a layer including a transistor that is fabricated by replacing opaque silicon used in a conventional TFT with a transparent material such as zinc oxide or titanium oxide. In the transparent transistor layer 1703, a source, a gate, a drain, and various dielectric layers 1708 and 1709 are formed, and the connection electrode 1707 for electrically connecting the drain to the first transparent electrode 1704 may be formed. The transparent transistor layer 1703 includes a plurality of transparent transistors that are distributed throughout the entire display surface of the transparent display device 100. The processor 1640 applies a control signal to the gate in each of the transistors in the transparent transistor layer 1703 to drive the corresponding transparent transistor and display information.
The first transparent electrode 1704 and the second transparent electrode 1706 are disposed opposite to each other with the transparent organic emission layer 1705 interposed therebetween. The first transparent electrode 1704, the transparent organic emission layer 1705, and the second transparent electrode 1706 form an organic light-emitting diode (OLED).
The transparent OLED may be classified as a passive matrix OLED (PMOLED) and an active matrix OLED (AMOLED) according to a driving method thereof. The PMOLED has a structure in which cross points between the first and second transparent electrodes 1704 and 1706 form pixels. However, in the AMOLED, a TFT is disposed to drive each of the pixels.
Each of the first and second transparent electrodes 1704 and 1706 includes a plurality of line electrodes, and the line electrodes of the two electrodes are arranged perpendicularly to each other. For example, if the line electrodes of the first transparent electrode 1704 are arranged in a transverse direction, the line electrodes of the second transparent electrode 1706 are arranged in a longitudinal direction. Accordingly, a plurality of crossing areas are formed between the first and second transparent electrodes 1704 and 1706. A transparent transistor is connected to each of the crossing areas.
The processor 1640 generates a potential difference in each of the crossing areas by using the transparent transistor. The electrons and holes are induced to the transparent organic emission layer 1705 from the first and second transparent electrodes 1704 and 1706 within the crossing area where the potential difference is generated, and then, combined with each other to emit light. On the other hand, the crossing area where the potential difference is not generated does not emit light, and accordingly, a background image of the rear surface is seen as it is.
Indium tin oxide (ITO) may be used for the first and second transparent electrodes 1704 and 1706. Otherwise, a newly developed material such as graphene may be used. Graphene is a transparent material having a honeycomb-shaped planar structure in which carbon atoms are connected to each other. Also, the transparent organic emission layer 1705 may be formed of various materials.
In addition, as described above, the transparent display 1610 may be formed as the projection type, as well as the transparent LCD type, the transparent TFEL type, and the transparent OLED type. The projection type is a method of displaying an image by projecting the image to a transparent screen such as a head-up display.
Also, the transparent display 1610 may be a dual-touchable touch screen, or may be a touch screen of which only a front surface is touchable.
The transparent display 1610 displays information including the object processed in the transparent display device 100. The information may include information other than the object. The information other than the object may denote information that is displayed but may not be selected by a user input.
The transparent display 1610 is formed as a transparent device, and a transparency of the transparent display 1610 may be adjusted by adjusting light transmittance of the transparent device or by adjusting the RGB value of each pixel.
Also, the transparent display 1610 may have a structure in which an OLED and an LCD are combined. In the transparent display 1610, the OLED may be located adjacent to a front surface input portion, and the LCD may be located adjacent to a rear surface input portion. In a case where the transparent display 1610 has the above combined structure, the transparent display 1610 maintains a transparent state like glass while the power is off, and when power is applied, the LCD blocks the light so that the transparent display 1610 becomes opaque.
The transparent display 1610 receives a touch input of the user through the front surface input interface. The screen displayed on the transparent display 1610 may include a user interface (UI) or a graphic user interface (GUI). Also, the transparent display 1610 may receive and display the information related to the object from the external device 110 according to the touch input (the first and second touch inputs) of the user on the object displayed on the external device 110 that is seen through the transparent display 1610.
The storage 1620 stores at least one program that is configured to execute the information display method on the transparent display 1610. The storage 1620 may include a high speed random access memory, a magnetic disk storage device, a non-volatile memory such as a flash memory, or another non-volatile semiconductor memory.
Referring to
The storage module 1810 includes a system database 1811 that is a storage for storing general data such as address book and environmental information, and a touch mode data region 1812 for storing setting values for touch modes of the object that will be displayed on the transparent display 1610.
The sensor recognition module 1820 includes a module 1821 for sensing a touch on the transparent display 1610, and a module 1822 for classifying the input touch. The module 1822 for classifying the input touch may classify the touch input as a front input mode 1823 for transferring an input on the front surface input interface to an event processor X11, a rear input mode 1824 for transferring an input on a rear surface input interface to the event processor X11, and a dual mode 1825 for transferring a dual-touch input (a touch input on both the front surface input interface and the rear surface input interface) to the event processor X11. However, the sensor recognition module 1820 may be configured with only an input mode that transfers the input on the front surface of the transparent display 1610 to the event processor X11.
The communication module 1830 may include a telephony module 1840 and a messaging module 1850, but is not limited thereto.
The telephony module 1840 includes an information collection module 1842 for connecting a phone call, and a voice service module 1841 for transmitting voice over the Internet based on voice over Internet protocol (VoIP).
The messaging module 1850 includes an instant messaging module 1851 regarding conversation between users through an Internet connection, a module 1852 regarding short message service (SMS) text messages and multimedia messages, and a module 1853 for emailing.
The input/output module 1860 includes a UI & graphic module 1861, and a multimedia module 1865.
The UI & graphic module 1861 includes an X11 module 1862 for receiving a touch input by a window manager, a module 1863 that outputs all objects seen by a user on a screen, and an evaluation module 1864 regarding a mode setting value stored for each object and a current touch input.
The multimedia module 1865 includes a moving picture reproducing module 1866, a moving picture and still image capturing module 1867, and a voice reproducing module 1868.
The programs for executing the information display method according to the exemplary embodiments may be stored in the storage module 1871. The storage module 1871 may store various applications.
As described above, the storage 1620 may store programs of various configurations, and is not limited to the example shown in
The communication interface 1630 may communicate with at least one of the external device 110, the server 120, and the AP 130. To perform communication, the communication interface 1630 may be configured to transmit/receive data via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network; however, the present invention is not limited thereto. In particular, the communication interface 1630 may include a global positioning system (GPS) module.
The processor 1640 may perform operations according to the above described exemplary embodiments by executing the programs stored in the storage 1620. The processor 1640 receives a first touch input representing reference information with respect to the external device 110 that is seen through the transparent display 1610, and a second touch input representing a selection of an object displayed on the external device 110. The processor 1640 requests the external device 110 for information related to the object based on the first and second touch inputs, via the communication interface 1630. When receiving the information related to the object from the external device 110 via the communication interface 1630, the processor 1640 displays the received information on the transparent display 1610.
Operations of the processor 1640 regarding the information display method according to the exemplary embodiments may be performed as described with reference to the flowcharts in
The sensor 1650 senses a current status of the transparent display device 100 such as location of the transparent display device 100, contact of the user on the transparent display device 100, orientation of the transparent display device 100, and acceleration or deceleration of the transparent display device 100 and generates a sensing signal for controlling operations of the transparent display device 100. In particular, the sensor 1650 may generate a sensing signal regarding the location of the transparent display device 100 in order to receive information based on the augmented reality service described with reference to
Referring to
The transparent display 1901 may be referred to as a touch screen. The transparent display 1901 may display objects, and may receive a touch-based user input. The transparent display 1901 may receive the touch-based user input via at least one of a front surface and a rear surface of the transparent display 1901. To do this, the transparent display 1901 includes at least one touch sensor. The touch sensor may recognize the user input based on (x, y) coordinates. The touch sensor may include a sensor for recognizing a direct-touch, or a sensor for recognizing a proximity-touch.
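The touch sensing described above can be sketched as a small event-classification function; the event structure, the `surface` parameter, and the hover-distance convention are assumptions for illustration only.

```python
def classify_touch(x, y, surface, hover_distance_mm=0.0):
    """Return a touch event with (x, y) coordinates, distinguishing a
    direct touch (contact) from a proximity touch (hover), as the text
    describes. surface: 'front' or 'rear', since the transparent display
    may receive input via either surface."""
    if surface not in ("front", "rear"):
        raise ValueError("surface must be 'front' or 'rear'")
    kind = "direct" if hover_distance_mm <= 0 else "proximity"
    return {"kind": kind, "surface": surface, "pos": (x, y)}
```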
The user input may be generated according to a request of the user, based on the user's gestures or the user's selection. A gesture of the user may be defined in various ways by combinations of the number of touches, the touch pattern, the touch area, and the touch intensity.
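One way such combinations could key a gesture lookup is sketched below; the gesture table, the area threshold, and the intensity check are all invented for this example, not taken from the disclosure.

```python
def recognize_gesture(num_touches, pattern, area, intensity):
    """Map a combination of touch count, pattern, area, and intensity to a
    gesture name. The table of gestures is hypothetical."""
    if intensity <= 0:                      # no measurable touch force
        return None
    gesture_table = {
        (1, "tap"): "select",
        (2, "tap"): "context-menu",
        (1, "drag"): "move",
    }
    gesture = gesture_table.get((num_touches, pattern))
    if gesture == "select" and area > 100:  # too large a contact: likely a palm
        return None
    return gesture
```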
As described above with reference to the transparent display 1610 of
The transparent display 1901 may display a screen respectively responding to a touch-based user input through at least one of the front and rear surfaces thereof, a user input based on the sensor 1903, a user input via the camera 1904, and a user input via the audio input interface 1908. The screen displayed on the transparent display 1901 may include a UI or a GUI screen.
The transparent display 1901 may have a physical structure like the transparent display 1610 described with reference to
The user input interface 1902 generates input data (or control data) for controlling operations of the transparent display device 100 according to a user input. The user input interface 1902 may include a keypad, a dome switch, a touch pad that is used instead of a mouse, a jog wheel, a jog switch, and a hardware (H/W) button.
The sensor 1903, like the sensor 1650 shown in
The sensor 1903 may include sensors other than the sensors for sensing the direct touch or the proximity touch described with regard to the transparent display 1901. For example, the sensor 1903 may include a proximity sensor. The proximity sensor detects whether an object approaches a previously set detection surface, or whether an external object is present nearby, by using the force of an electromagnetic field or infrared rays, without an actual physical touch. Examples of the proximity sensor include a transmissive photoelectric sensor, a direct reflective photoelectric sensor, a mirror reflective photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitive proximity sensor, a magnetic proximity sensor, an infrared proximity sensor, etc.
The camera 1904 processes an image frame such as a still image or a moving image obtained from an image sensor in a conference call mode or a photographing mode. The processed image frame may be displayed on the transparent display 1901. The image frame processed by the camera 1904 may be stored in the storage 1905 or may be transmitted to another device through the communication interface 1906 or the port 1907. The device receiving the transmitted image frame may include at least one of the external device 110, the server 120, and the AP 130, but is not limited thereto.
The camera 1904 may also be configured to receive the user input to the front and rear surfaces of the transparent display 1901 or to photograph the object. The number of cameras 1904 may be two or more according to a structure of the transparent display device 100. The camera 1904 may be used as an input apparatus that recognizes a user's spatial gesture.
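The routing of a processed image frame described above (display it, store it, or transmit it to another device) can be illustrated as follows; the frame representation and the sink interfaces are stand-ins, not part of the disclosure.

```python
def route_image_frame(frame, destination, display, storage, comm):
    """Send a processed image frame to one of the destinations the text
    names: the transparent display, the storage, or the communication
    interface (e.g. toward the external device 110)."""
    if destination == "display":
        display.show(frame)
    elif destination == "storage":
        storage.save(frame)
    elif destination == "transmit":
        comm.send(frame)
    else:
        raise ValueError(f"unknown destination: {destination}")
```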
The storage 1905 stores at least one program configured to be executed by the processor 1912, which will be described later, and a resource. The at least one program includes a program that executes an information display method, an operating system (OS) program of the transparent display device 100, applications set in the transparent display device 100, and a program necessary for performing various functions (for example, communication function and display function) of the transparent display device 100.
The resource includes information necessary for executing the above-described programs, user interface screen information for performing the information display method mentioned in embodiments of the present invention, and the user input information recognized as the first and second touch inputs. The user input information recognized as the first and second touch inputs may be set based on the examples described with reference to
The storage 1905 may be configured to independently include a storage that stores at least one program necessary for performing various functions of the transparent display device 100 and an operating system program, and a storage that stores one or more programs, resources, and various applications that execute the information display method.
The storage 1905 may include at least one type of storage medium among a flash memory type, a hard disk type, a multimedia card micro type, a card type memory (for example, an SD or XD memory), a read only memory (ROM), an electronically erasable programmable read-only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, and an optical disk.
The communication interface 1906 may be configured to transmit data to and receive data from at least one of the external device 110, the server 120, and the AP 130 via a wireless communication network such as wireless Internet, wireless Intranet, a wireless phone network, a wireless local area network (LAN), a Wi-Fi network, a Wi-Fi direct (WFD) network, a 3G network, a 4G Long Term Evolution (LTE) network, a Bluetooth network, an infrared data association (IrDA) network, a radio frequency identification (RFID) network, an ultra-wideband (UWB) network, a Zigbee network, or a near field communication (NFC) network, but is not limited thereto.
The communication interface 1906 may include at least one of a broadcasting reception module, a mobile communication module, a wireless Internet module, a wired Internet module, a short distance communication module, and a location information module but is not limited thereto.
The broadcasting reception module receives a broadcasting signal and/or broadcasting related information from an external broadcasting management server through a broadcasting channel. The broadcasting channel may include a satellite channel and a terrestrial channel. The mobile communication module transmits and receives a wireless signal to and from at least one of a base station, the external device 110, and the server 120 over a mobile communication network. The wireless signal may include various types of data according to a voice call signal, a conference phone call, or transmission/reception of a text/multimedia message. The wireless Internet module is a module for a wireless Internet connection. The wired Internet module is a module for a wired Internet connection.
The short distance communication module is a module for short distance communication. Short distance communication technologies may use Bluetooth, RFID, IrDA, UWB, Zigbee, WFD, NFC, etc. Like the exemplary embodiment shown in
The location information module is a module for identifying or obtaining the location of the transparent display device 100. As an example, a GPS module may be used. The GPS module receives location information from a plurality of satellites. The location information may include coordinate information represented by latitude and longitude.
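As a simplified illustration of turning per-satellite location information into one coordinate pair, the sketch below merely averages readings; a real GPS module solves for position by trilateration with clock correction, so this is not how the module itself works.

```python
def estimate_location(fixes):
    """fixes: list of (latitude, longitude) readings derived from a
    plurality of satellites, as the text describes. Returns a single
    averaged (latitude, longitude) estimate -- illustration only."""
    if not fixes:
        raise ValueError("no satellite fixes available")
    lat = sum(f[0] for f in fixes) / len(fixes)
    lon = sum(f[1] for f in fixes) / len(fixes)
    return (lat, lon)
```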
The port 1907 may transmit data to and receive data from the external device 110 by using a plug and play interface such as a USB port. The plug and play interface is a module that automatically detects a device when it is plugged into the transparent display device 100 and enables its use (i.e., play). The pluggable device is not limited to the external device 110.
The audio input interface 1908 receives an input of an external sound signal in a call mode, a recording mode, or a voice recognition mode, etc. The audio input interface 1908 may be configured as, for example, a microphone. The audio input interface 1908 may be configured to include various noise removal algorithms for removing noise that occurs during the process of receiving the input of the external sound signal.
The sound signal input by using the audio input interface 1908 may be a user input representing a selection of the object displayed on the external device 110 that is seen through the transparent display 1901, according to an exemplary embodiment. The sound signal input by using the audio input interface 1908 may be stored in the storage 1905 or may be transmitted to the outside through the communication interface 1906 or the port 1907. The outside may include the external device 110, other external devices (not shown), the server 120, and the AP 130.
The audio signal processing unit 1909 provides an interface between the audio input interface 1908 and the processor 1912 and between the audio output interface 1910 and the processor 1912. That is, the audio signal processing unit 1909 converts the sound signal received from the audio input interface 1908 into audio data that may be processed by the processor 1912 and transmits the audio data to the processor 1912. The audio signal processing unit 1909 converts the audio data transmitted from the processor 1912 into an electrical sound signal and transmits the electrical sound signal to the audio output interface 1910.
The audio output interface 1910 outputs the sound signal or the audio signal received from the audio signal processing unit 1909 in the call mode or an audio reproduction mode. The audio output interface 1910 may be configured as a speaker. The audio input interface 1908 and the audio output interface 1910 may be integrated, as in a headset.
The transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908 may be referred to as input apparatuses or input/output apparatuses according to a function of a user interface between the transparent display device 100 and the user. For example, in a case where the function of the user interface between the transparent display device 100 and the user includes a touch screen function, a sound recognition function, and a spatial gesture recognition function, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908 may be referred to as the input apparatuses, and the transparent display 1901 may be referred to as the input/output apparatus.
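The classification above (an apparatus that only receives input versus one that both receives input and displays output) can be expressed as a small grouping function; the role-set representation is an assumption for illustration.

```python
def classify_apparatuses(apparatuses):
    """apparatuses: dict mapping an apparatus name to its set of roles,
    e.g. {'input'} or {'input', 'output'}. Returns the two groups the
    text names: input apparatuses and input/output apparatuses."""
    inputs, io = [], []
    for name, roles in sorted(apparatuses.items()):
        if {"input", "output"} <= roles:
            io.append(name)      # e.g. the transparent display 1901
        else:
            inputs.append(name)  # e.g. sensor, camera, audio input
    return {"input": inputs, "input/output": io}
```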
The power supply 1911 supplies power to various elements of the transparent display device 100. The power supply 1911 includes one or more power sources such as a battery and an alternating current (AC) power source. The transparent display device 100 may not include the power supply 1911 but may include a connection unit (not shown) that may be connected to an external power supply (not shown).
The processor 1912 may include one or more processors that control the overall operation of the transparent display device 100. Although the processor 1912 is implemented as a single chip in
The processor 1912 may generally control the transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, the storage 1905, the communication interface 1906, the port 1907, the audio input interface 1908, the audio signal processing unit 1909, and the audio output interface 1910. Thus, the processor 1912 may be referred to as a controller, a microprocessor, a digital signal processor, etc.
The processor 1912 may also process user inputs received through the transparent display 1901, the user input interface 1902, the sensor 1903, the camera 1904, and the audio input interface 1908, which correspond to input apparatuses, and may provide a user interface based on the transparent display 1901.
The processor 1912 may execute at least one program related to the information display method according to the exemplary embodiments. The processor 1912 may execute the program by reading the program from the storage 1905 or downloading the program from an external apparatus such as an application providing server (not shown) or a market server (not shown) through the communication interface 1906.
The processor 1912 may be understood to include an interface function unit interfacing between various functional modules and the processor 1912 of the transparent display device 100. The operation of the processor 1912 related to the information display method according to the exemplary embodiments may be performed as shown in flowcharts of
In operation S2001, the external device 110 receives, from the transparent display device 100, a request for information related to at least one object displayed on the external device 110. The request for information may be transmitted via at least one of direct communication between the devices, communication via a server, and communication via a repeater.
The request for information related to the object may be generated based on the first touch input and the second touch input on the transparent display device 100. The first touch input is a user input to the transparent display device 100 representing the reference information about the external device 110 that is seen through the transparent display device 100. The second touch input is a user input to the transparent display device 100 selecting at least one object displayed on the external device 110 that is seen through the transparent display device 100.
The request for information related to the object may include the displayed location information (coordinate information) of the selected object on the external device 110 as described in the above exemplary embodiments, the screen size of the transparent display device 100, and the coordinate information of the first and second touch inputs on the transparent display device 100, but is not limited thereto.
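The fields the request may carry, as enumerated above, can be sketched as a payload constructor; the field names are assumptions, since the disclosure does not fix a wire format.

```python
def build_info_request(object_coords, screen_size, first_touch, second_touch):
    """Assemble the request payload the text enumerates: the displayed
    location (coordinate information) of the selected object on the
    external device, the screen size of the transparent display device,
    and the coordinates of the first and second touch inputs."""
    return {
        "object_coords": object_coords,  # on the external device 110
        "screen_size": screen_size,      # of the transparent display device 100
        "first_touch": first_touch,      # reference-information touch
        "second_touch": second_touch,    # object-selection touch
    }
```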
However, when the transparent display device 100 operates as the flowchart shown in
In operation S2002, the external device 110 selects an object in response to the received request for the information related to the object. For example, if the requested object is an icon, the external device 110 selects the icon of the requested object and the application program connected to the icon. If the requested object is a folder, the external device 110 selects the requested folder and files or data located at a lower layer of the folder. If the requested object is an object included in one screen, the external device 110 selects the object by using the coordinate information included in the received request. If the requested object is a plurality of objects included in one screen, the external device 110 respectively selects the plurality of objects by using the coordinate information of the object included in the received request.
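The per-type selection in operation S2002 amounts to a dispatch on the requested object's kind. A minimal sketch follows, with the external device modeled as a plain dictionary; that model and the request shape are hypothetical.

```python
def select_requested_objects(device, request):
    """device: {'icons': {name: app}, 'folders': {name: [children]},
    'screen': {(x, y): object}} -- an assumed layout for illustration.
    Mirrors S2002: an icon selects the icon and its connected application,
    a folder selects the folder's lower-layer contents, and a screen
    object is selected by the coordinate information in the request."""
    kind = request["type"]
    if kind == "icon":
        name = request["name"]
        return {"icon": name, "app": device["icons"][name]}
    if kind == "folder":
        name = request["name"]
        return {"folder": name, "children": device["folders"][name]}
    if kind == "screen_object":
        # One or more objects may be selected via their coordinates.
        return [device["screen"][xy] for xy in request["coords"]]
    raise ValueError(f"unknown object type: {kind}")
```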
In operation S2003, the external device 110 transmits information related to the selected object to the transparent display device 100. The information related to the object may be transmitted to the transparent display device 100 in the same manner in which the request for the information was received, but is not limited thereto. For example, the request for the information may be received via direct communication between the devices, and the information related to the object selected in response to the request may be transmitted to the transparent display device 100 via the repeater or the server.
In operation S2101, the transparent display device 100 receives a touch input for selecting an object displayed on the external device 110 that is seen through the transparent display device 100. Here, the touch input corresponds to the second touch input mentioned in
In operation S2102, the transparent display device 100 requests the external device 110 for information related to the object selected based on the touch input. A signal for requesting information related to the object selected based on the touch input transmitted to the external device 110 includes a signal for requesting information related to the object selected based on the second touch input mentioned in
In operation S2103, the transparent display device 100 receives information about the selected object from the external device 110. The received information corresponds to the request signal in operation S2102, and may be the same as the information received in operation S203 shown in
In operation S2104, the transparent display device 100 displays the received information. The received information may be displayed in the same manner as that of operation S204 shown in
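Operations S2101 through S2104 can be condensed into one function on the transparent display device's side; the injected callables keep the transport abstract (direct, via a server, or via a repeater, as the text allows), and their names are assumptions.

```python
def display_object_info(touch_input, request_fn, display_fn):
    """S2101: touch_input selects an object seen through the display.
    S2102: request_fn asks the external device for related information.
    S2103: the returned value is the information received in response.
    S2104: display_fn renders the received information."""
    info = request_fn(touch_input)
    display_fn(info)
    return info
```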
The flowchart shown in
The information display method according to exemplary embodiments may also be embodied as computer readable codes on a computer readable recording medium. The computer readable medium may be any recording apparatus capable of storing data that is read by a computer system, e.g., a read-only memory (ROM), a random access memory (RAM), a compact disc (CD)-ROM, a magnetic tape, a floppy disk, an optical data storage device, and so on. The computer readable medium may be distributed among computer systems that are interconnected through a network, and the present invention may be stored and implemented as computer readable code in the distributed system.
While the exemplary embodiments have been particularly shown and described, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present invention as defined by the following claims.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jun 07 2018 | Samsung Electronics Co., Ltd. | (assignment on the face of the patent) |
Date | Maintenance Fee Events |
Jun 07 2018 | BIG: Entity status set to Undiscounted (note the period is included in the code). |
Jun 05 2019 | PTGR: Petition Related to Maintenance Fees Granted. |
May 20 2024 | REM: Maintenance Fee Reminder Mailed. |
Nov 04 2024 | EXP: Patent Expired for Failure to Pay Maintenance Fees. |