An information processing device (100) includes a command management unit (101) storing a command in association with a parameter, an instruction receiving unit (104) receiving a user operation performed on an icon displayed on a display screen, a display control unit (10) causing an object icon identifying an object to be displayed in a first display area of the display screen, and an instruction generation unit (115) generating an instruction which is a group of the command and the parameter. The display control unit (10) further causes a command icon identifying a command executable on an operation target object to be displayed in a second display area, and causes a parameter icon identifying a parameter associated with the selected command to be displayed in a third display area. When at least two command icons are selected, the instruction generation unit (115) generates an instruction such that at least two commands identified by these two command icons respectively are executed in order of selection.
10. An information processing method of an information processing device which generates an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen and which includes a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process, said information processing method comprising:
receiving the user operation performed on the icon displayed on the display screen;
performing control to cause at least one object icon, which identifies the object, to be displayed in a first display area of the display screen; and
generating an instruction which is a group of the command and the parameter stored in the command management unit, according to the user operation received in said receiving,
wherein, in said performing of the control, when an operation, in which at least one of the at least one object icon displayed in the first display area is selected, is received in said receiving, the command executable on an operation target object identified by the selected object icon is extracted from the command management unit and at least one command icon identifying the extracted command is caused to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen,
wherein, in said performing of the control, when an operation, in which at least one of the at least one command icon displayed in the second display area is selected, is received in said receiving, the parameter associated with the command identified by the selected command icon is extracted from the command management unit and at least one parameter icon identifying the extracted parameter is caused to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen,
wherein, in said generating, when at least two command icons are selected, an instruction is generated such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected,
wherein said performing of the control includes selecting, according to a user operation, an object icon displayed in the first display area of the display screen, and dragging the selected object icon,
wherein, when the object icon that is selected and dragged overlays the at least one command icon displayed in the second display area, said receiving measures a first time period for which the selected object icon overlays the at least one command icon displayed in the second display area, and receives, upon detecting that the first measured time period has reached a first predetermined threshold or higher, an operation in which the at least one command icon displayed in the second display area is selected, and
wherein, when the object icon that is selected and dragged moves from the selected at least one command icon displayed in the second display area and overlays the at least one parameter icon displayed in the third display area, said receiving measures a second time period for which the selected object icon overlays the at least one parameter icon displayed in the third display area, and receives, upon detecting that the second measured time period has reached a second predetermined threshold or higher, an operation in which the at least one parameter icon displayed in the third display area is selected.
11. A non-transitory computer-readable recording medium having a program recorded thereon for causing an information processing device to generate an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen, the information processing device including a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object, and the parameter specifying a detail of the process, the program causing a computer to execute a method comprising:
receiving the user operation performed on the icon displayed on the display screen;
performing control to cause at least one object icon, which identifies the object, to be displayed in a first display area of the display screen; and
generating an instruction which is a group of the command and the parameter stored in the command management unit, according to the user operation received in said receiving,
wherein, in said performing of the control, when an operation, in which at least one of the at least one object icon displayed in the first display area is selected, is received in said receiving, the command executable on an operation target object identified by the selected object icon is extracted from the command management unit and at least one command icon identifying the extracted command is caused to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen,
wherein, in said performing of the control, when an operation, in which at least one of the at least one command icon displayed in the second display area is selected, is received in said receiving, the parameter associated with the command identified by the selected command icon is extracted from the command management unit and at least one parameter icon identifying the extracted parameter is caused to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen,
wherein, in said generating, when at least two command icons are selected, an instruction is generated such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected,
wherein said performing of the control includes selecting, according to a user operation, an object icon displayed in the first display area of the display screen, and dragging the selected object icon,
wherein, when the object icon that is selected and dragged overlays the at least one command icon displayed in the second display area, said receiving measures a first time period for which the selected object icon overlays the at least one command icon displayed in the second display area, and receives, upon detecting that the first measured time period has reached a first predetermined threshold or higher, an operation in which the at least one command icon displayed in the second display area is selected, and
wherein, when the object icon that is selected and dragged moves from the selected at least one command icon displayed in the second display area and overlays the at least one parameter icon displayed in the third display area, said receiving measures a second time period for which the selected object icon overlays the at least one parameter icon displayed in the third display area, and receives, upon detecting that the second measured time period has reached a second predetermined threshold or higher, an operation in which the at least one parameter icon displayed in the third display area is selected.
12. An integrated circuit implemented on an information processing device which generates an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen and which includes a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process, said integrated circuit comprising:
an instruction receiving unit configured to receive the user operation performed on the icon displayed on the display screen;
a display control unit configured to cause at least one object icon, which identifies the object, to be displayed in a first display area of the display screen; and
an instruction generation unit configured to generate an instruction which is a group of the command and the parameter stored in said command management unit, according to the user operation received by said instruction receiving unit,
wherein, when said instruction receiving unit receives an operation in which at least one of the at least one object icon displayed in the first display area is selected, said display control unit is further configured to extract, from said command management unit, the command executable on an operation target object identified by the selected object icon and to cause at least one command icon identifying the extracted command to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen,
wherein, when said instruction receiving unit receives an operation in which at least one of the at least one command icon displayed in the second display area is selected, said display control unit is further configured to extract, from said command management unit, the parameter associated with the command identified by the selected command icon and to cause at least one parameter icon identifying the extracted parameter to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen,
wherein, when at least two command icons are selected, said instruction generation unit is configured to generate an instruction such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected,
wherein said display control unit is configured to, according to a user operation, select an object icon displayed in the first display area of the display screen, and drag the selected object icon,
wherein, when the object icon that is selected and dragged overlays the at least one command icon displayed in the second display area, said instruction receiving unit is configured to measure a first time period for which the selected object icon overlays the at least one command icon displayed in the second display area, and receive, upon detecting that the first measured time period has reached a first predetermined threshold or higher, an operation in which the at least one command icon displayed in the second display area is selected, and
wherein, when the object icon that is selected and dragged moves from the selected at least one command icon displayed in the second display area and overlays the at least one parameter icon displayed in the third display area, said instruction receiving unit is configured to measure a second time period for which the selected object icon overlays the at least one parameter icon displayed in the third display area, and receive, upon detecting that the second measured time period has reached a second predetermined threshold or higher, an operation in which the at least one parameter icon displayed in the third display area is selected.
1. An information processing device which generates an instruction specifying an operation of a device, in response to a user operation performed on an icon displayed on a display screen, said information processing device comprising:
a microprocessor;
at least one memory;
a storage unit including a command management unit configured to store at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process;
an instruction receiving unit configured to receive the user operation performed on the icon displayed on the display screen;
a display control unit configured to cause at least one object icon, which identifies the object, to be displayed in a first display area of the display screen; and
an instruction generation unit configured to generate an instruction which is a group of the command and the parameter stored in said command management unit, according to the user operation received by said instruction receiving unit,
wherein, when said instruction receiving unit receives an operation in which at least one of the at least one object icon displayed in the first display area is selected, said display control unit is further configured to extract, from said command management unit, the command executable on an operation target object identified by the selected object icon and to cause at least one command icon identifying the extracted command to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen,
wherein, when said instruction receiving unit receives an operation in which at least one of the at least one command icon displayed in the second display area is selected, said display control unit is further configured to extract, from said command management unit, the parameter associated with the command identified by the selected command icon and to cause at least one parameter icon identifying the extracted parameter to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen,
wherein, when at least two command icons are selected, said instruction generation unit is configured to generate an instruction such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected,
wherein said display control unit is configured to, according to a user operation, select an object icon displayed in the first display area of the display screen, and drag the selected object icon,
wherein, when the object icon that is selected and dragged overlays the at least one command icon displayed in the second display area, said instruction receiving unit is configured to measure a first time period for which the selected object icon overlays the at least one command icon displayed in the second display area, and receive, upon detecting that the first measured time period has reached a first predetermined threshold or higher, an operation in which the at least one command icon displayed in the second display area is selected, and
wherein, when the object icon that is selected and dragged moves from the selected at least one command icon displayed in the second display area and overlays the at least one parameter icon displayed in the third display area, said instruction receiving unit is configured to measure a second time period for which the selected object icon overlays the at least one parameter icon displayed in the third display area, and receive, upon detecting that the second measured time period has reached a second predetermined threshold or higher, an operation in which the at least one parameter icon displayed in the third display area is selected.
2. The information processing device according to
3. The information processing device according to
4. The information processing device according to
5. The information processing device according to
wherein said instruction generation unit is further configured to register the generated instruction as a processing pattern in a pattern management unit included in said storage unit, and
wherein said display control unit is further configured to extract, from said pattern management unit, the processing pattern executable on the object identified by the selected object icon and to cause a processing pattern icon identifying the extracted processing pattern to be displayed in a pattern display area different from the first, second, and third display areas.
6. The information processing device according to
the display screen; and
a detection unit configured to detect an orientation of the display screen,
wherein said display control unit is configured to change a position at which the icon is displayed on the display screen, according to the orientation of the display screen detected by said detection unit.
7. The information processing device according to
wherein the object is an external device connected to said information processing device, and
wherein said information processing device further comprises a communication unit configured to send the instruction generated by said instruction generation unit to the external device that is the operation target object.
8. The information processing device according to
wherein said communication unit is further configured to receive, from the operation target object, state information indicating a current operating state of the operation target object, and
wherein said display control unit is configured to extract, from said command management unit, the command that is executable on the operation target object in the current operating state indicated by the state information.
9. The information processing device according to
wherein, when the operation target object includes a display screen:
said communication unit is configured to receive, from the operation target object, display layout information indicating a detail displayed on the display screen of the operation target object, and
wherein said display control unit is further configured to cause the detail indicated by the display layout information to be displayed in a virtual screen display area different from the first, second, and third display areas of the display screen.
The present invention relates to information processing devices, and particularly to an information processing device which generates an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen.
In recent years, display screens with touch panels have been introduced not only into public facilities such as ticket-vending machines, but also into various kinds of information devices used by individual users, including personal computers, PDAs (Personal Digital Assistants), remote controls of home electrical appliances such as TVs, and high-functionality mobile terminals called smartphones. Many of these information devices have sensors on or near their screens. A user performs an operation by directly touching a link or button represented by characters or an image displayed on the screen of the information device. Using the sensor, the information device detects the operation request from the user. Thus, the user can cause the information device to execute a desired process through an intuitive operation.
Processes performed by the user using the information device include, as a first example, posting an article on a personal blog site from a mobile terminal. This posting process includes changing the size of an image captured using a camera function of the mobile terminal, changing the color of the image such as turning the image into a sepia tone, editing such as assigning a tag which is a keyword, and sending an email message with the edited image attached.
Here, a mobile terminal with a typical touch panel is provided with a hierarchical menu designed for touch-panel use, so as to receive user operations. The user touches a menu icon displayed on the touch panel to activate a menu screen, or presses a button provided on the body of the mobile terminal for the same purpose. As a result, the mobile terminal displays the menu screen. Moreover, when the user touches an icon for activating the camera function that is displayed on the menu screen, the camera function is activated.
Next, image capturing is started by means of a shutter icon displayed on the touch panel or a shutter button provided on the body of the mobile terminal, and the captured image is recorded. After this, the following processes are sequentially performed on the captured image, which is an operation target object. The user selects “Image Edit” from “Submenu”. Then, after selecting “Resize” from “Submenu”, which is displayed again, the user selects “Desired Size”. Accordingly, the mobile terminal resizes the image. Then, after selecting a color change menu such as “Retouch Photo” from “Submenu”, the user selects “Sepia Tone”.
As a result, the mobile terminal performs the color change process on the image. Next, after selecting “Assign Keyword” from “Submenu”, the user enters a keyword which the user desires to assign or selects a keyword from among candidates. Then, after selecting “Send Email with Attachment” from “Submenu”, the user performs processes to, for example, “Select Destination Address” and “Enter Title and Text” through touch operations (or operations by means of button keys). In this way, in order to accomplish one purpose, that is, posting an article to a blog site, plural touch operations are performed in a repetitive manner.
Such operations are routinely performed by a user with full knowledge of the operations of the mobile terminal. However, since the menu screens are hierarchically structured, a large number of screens must be displayed despite the introduction of the touch panel. As the depth of the hierarchy increases, the number of touches on the touch panel (or the number of operations performed on the operation keys) by the user tends to increase proportionately. That is to say, the increased depth of the hierarchy may actually reduce convenience for the user.
When wishing to sequentially execute processes that are based on plural non-sequential functions, the user needs to know in advance, after selecting a menu icon, which submenu the next item is to be selected from. Thus, in practice, the hierarchical structure of the menu screens may result in inconvenience to a user who is not good at operating the device or who has little knowledge of the terminal functions. In addition, even a skilled user has to repeat the same operations routinely due to this hierarchical structure of the menu screens.
A first method to solve the stated problem in the first example is disclosed in Patent Literature 1. To be more specific, a predetermined icon of a processing-command execution area is displayed on a screen display unit. By manipulating a cursor using a touch panel or mouse, the user drags an operation target object (i.e., an icon) representing data such as an image which is to be an operation target, through the processing-command execution area. As a result, the mobile terminal can perform the process corresponding to this processing-command execution area on the data represented by the operation target object.
Moreover, parameter candidates (for example, image zoom ranges) may be set in advance for each processing-command execution area. With this, near the area through which the operation target object (i.e., the icon) passes, the corresponding parameter candidates can be displayed. Then, the user can select a desired value from among the displayed plural parameter candidates.
Also, a method of executing processes by means of a script or batch file is widely known. With this method, the user describes in advance, in a single description file and according to a description rule, the plural processes that the user wishes the computer to execute sequentially. Accordingly, the plural processes are executed in order.
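By way of illustration only, the following Python sketch shows one way such a description file might be interpreted; the file format, the process names (resize, retouch, tag), and the helper functions are assumptions introduced here to mirror the blog-posting example, not part of the present disclosure.

    # Minimal sketch of the script/batch approach: each line of the
    # description file names one process and one parameter, and the
    # processes are executed strictly in the written order.
    # The process names and the file format are illustrative only.

    def resize(image, size):
        print(f"resizing {image} to {size}")

    def retouch(image, tone):
        print(f"applying {tone} tone to {image}")

    def assign_keyword(image, keyword):
        print(f"tagging {image} with '{keyword}'")

    PROCESSES = {"resize": resize, "retouch": retouch, "tag": assign_keyword}

    def run_batch(description_file, image):
        with open(description_file) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue  # skip blank lines and comments
                name, _, param = line.partition(" ")
                PROCESSES[name](image, param)  # executed in written order

    # Example description file ("post_blog.txt"):
    #   resize 640x480
    #   retouch sepia
    #   tag vacation
    # run_batch("post_blog.txt", "IMG_0001.jpg")

As the following paragraphs note, writing such a file presupposes knowledge of the description rule, which is exactly the burden discussed below.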
However, using the method disclosed in Patent Literature 1 explained above, the operation target object is limited to contents such as images and audio. Also, it is assumed that the process to be performed on the content is one at a time.
Here, it is possible to apply the above-explained method of executing the plural processes. However, with this method, the user has to describe in advance the processes to be executed on the computer in a computer-interpretable language according to the description rule. On this account, this method cannot be considered highly convenient for a user who is not good at operating the device or who does not have enough knowledge about the terminal functions.
The present invention is conceived in view of the aforementioned problem, and has an object to provide an information processing device which is capable of executing a plurality of processes with intuitive operability.
The information processing device in an aspect of the present invention generates an instruction specifying an operation of a device, in response to a user operation performed on an icon displayed on a display screen. To be more specific, the information processing device includes: a storage unit including a command management unit which stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process; an instruction receiving unit which receives the user operation performed on the icon displayed on the display screen; a display control unit which causes at least one object icon identifying the object to be displayed in a first display area of the display screen; and an instruction generation unit which generates an instruction that is a group of the command and the parameter stored in the command management unit, according to the user operation received by the instruction receiving unit. When the instruction receiving unit receives an operation in which at least one of the at least one object icon displayed in the first display area is selected, the display control unit extracts, from the command management unit, the command executable on an operation target object identified by the selected object icon and causes at least one command icon identifying the extracted command to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen. When the instruction receiving unit receives an operation in which at least one of the at least one command icon displayed in the second display area is selected, the display control unit extracts, from the command management unit, the parameter associated with the command identified by the selected command icon and causes at least one parameter icon identifying the extracted parameter to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen. When at least two command icons are selected, the instruction generation unit generates an instruction such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected.
With this configuration, the object icon, the command icon, and the parameter icon are displayed on the same screen at the same time. Thus, the user can execute the instruction on the operation target object, through an intuitive operation.
Also, when the operation target object is selected, only the command icons representing the commands executable on the operation target object are displayed. Moreover, when the command is selected, only the parameter icons representing the parameters associated with the present command are displayed. Furthermore, a plurality of commands to be executed on the operation target object can be selected at one time. This results in improved convenience to the user.
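To make the relationships described above concrete, the following is a minimal Python sketch, assuming hypothetical command names and parameter values, of how the command management unit's associations and the selection-order instruction generation might be modeled.

    # Sketch of the summary above (all names are hypothetical):
    # the command management unit associates each command with its
    # candidate parameters, and a generated instruction is a group of
    # (command, parameter) pairs kept in the order of selection.

    COMMAND_MANAGEMENT = {
        "resize":  ["640x480", "320x240"],   # command -> candidate parameters
        "retouch": ["sepia", "monochrome"],
    }

    def generate_instruction(selections):
        """selections: (command, parameter) pairs in the order selected."""
        instruction = []
        for command, parameter in selections:
            # Only parameters associated with the selected command are valid.
            assert parameter in COMMAND_MANAGEMENT[command]
            instruction.append((command, parameter))
        return instruction  # later executed in exactly this order

    instruction = generate_instruction(
        [("resize", "640x480"), ("retouch", "sepia")])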
Moreover, the information processing device may further include an instruction execution unit which executes the instruction generated by the instruction generation unit on the operation target object. With this, the instruction generated for the content (i.e., the substantial data) held by the information processing device can be executed.
Furthermore, when at least two operation target objects are selected, the display control unit may extract the command that is executable on and common to the at least two operation target objects.
Also, the display control unit may correct spaces between a plurality of icons displayed on the display screen, according to a size of an area that remains unused for displaying an icon on the display screen. With this, a display area for an icon which is to be newly displayed and to be selected by the user is secured. This further improves the operability.
The instruction generation unit may register the generated instruction as a processing pattern into a pattern management unit included in the storage unit. The display control unit may extract, from the pattern management unit, the processing pattern executable on the object identified by the selected object icon and cause a processing pattern icon identifying the extracted processing pattern to be displayed in a pattern display area different from the first, second, and third display areas. This improves convenience to the user in a case, for example, where the same process is repeatedly executed.
Also, the information processing device may include: the display screen; and a detection unit which detects an orientation of the display screen. The display control unit may change a position at which the icon is displayed on the display screen, according to the orientation of the display screen detected by the detection unit. With this, the icons can be displayed in an appropriate layout, depending on the orientation of the display screen.
Moreover, the object may be an external device connected to the information processing device. Furthermore, the information processing device may include a communication unit which sends the instruction generated by the instruction generation unit to the external device that is the operation target object. With this, the information processing device at hand can generate an instruction and cause the external device to execute the instruction.
Also, the communication unit may receive, from the operation target object, state information indicating a current operating state of the operation target object. The display control unit may extract, from the command management unit, the command that is executable on the operation target object in the current operating state indicated by the state information. With this, according to the operating state of the operation target device, only the icons of the executable commands are displayed. This can improve the efficiency of the user operation.
Also, when the operation target object includes a display screen: the communication unit may receive, from the operation target object, display layout information indicating a detail displayed on the display screen of the operation target object; and the display control unit may cause the detail indicated by the display layout information to be displayed in a virtual screen display area different from the first, second, and third display areas of the display screen. With this, the operation can be performed while the display screen of the operation target object and the virtual screen display area are used in cooperation. Accordingly, the operability is further improved.
An information processing method in an aspect of the present invention is a method of an information processing device which generates an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen and which includes a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process. To be more specific, the information processing method includes: receiving the user operation performed on the icon displayed on the display screen; performing control to cause at least one object icon identifying the object to be displayed in a first display area of the display screen; and generating an instruction which is a group of the command and the parameter stored in the command management unit, according to the user operation received in the receiving. In the performing control: when an operation in which at least one of the at least one object icon displayed in the first display area is selected is received in the receiving, the command executable on an operation target object identified by the selected object icon is extracted from the command management unit and at least one command icon identifying the extracted command is caused to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen; and when an operation in which at least one of the at least one command icon displayed in the second display area is selected is received in the receiving, the parameter associated with the command identified by the selected command icon is extracted from the command management unit and at least one parameter icon identifying the extracted parameter is caused to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen. In the generating, when at least two command icons are selected, an instruction is generated such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected.
A non-transitory computer-readable recording medium in an aspect of the present invention is a medium having a program recorded thereon for causing an information processing device to generate an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen, the information processing device including a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object, and the parameter specifying a detail of the process. The program causes the information processing device to execute: receiving the user operation performed on the icon displayed on the display screen; performing control to cause at least one object icon identifying the object to be displayed in a first display area of the display screen; and generating an instruction which is a group of the command and the parameter stored in the command management unit, according to the user operation received in the receiving. In the performing control: when an operation in which at least one of the at least one object icon displayed in the first display area is selected is received in the receiving, the command executable on an operation target object identified by the selected object icon is extracted from the command management unit and at least one command icon identifying the extracted command is caused to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen; and when an operation in which at least one of the at least one command icon displayed in the second display area is selected is received in the receiving, the parameter associated with the command identified by the selected command icon is extracted from the command management unit and at least one parameter icon identifying the extracted parameter is caused to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen. In the generating, when at least two command icons are selected, an instruction is generated such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected.
An integrated circuit in an aspect of the present invention is implemented on an information processing device which generates an instruction specifying an operation of a device in response to a user operation performed on an icon displayed on a display screen and which includes a storage unit having a command management unit that stores at least one command in association with at least one parameter, the command indicating a process to be executed on an object and the parameter specifying a detail of the process. To be more specific, the integrated circuit includes: an instruction receiving unit which receives the user operation performed on the icon displayed on the display screen; a display control unit which causes at least one object icon identifying the object to be displayed in a first display area of the display screen; and an instruction generation unit which generates an instruction which is a group of the command and the parameter stored in the command management unit, according to the user operation received by the instruction receiving unit. When the instruction receiving unit receives an operation in which at least one of the at least one object icon displayed in the first display area is selected, the display control unit extracts, from the command management unit, the command executable on an operation target object identified by the selected object icon and causes at least one command icon identifying the extracted command to be displayed in a second display area different from the first display area, without changing a detail displayed in the first display area of the display screen. When the instruction receiving unit receives an operation in which at least one of the at least one command icon displayed in the second display area is selected, the display control unit extracts, from the command management unit, the parameter associated with the command identified by the selected command icon and causes at least one parameter icon identifying the extracted parameter to be displayed in a third display area different from the first and second display areas, without changing details displayed in the first and second display areas of the display screen. When at least two command icons are selected, the instruction generation unit generates an instruction such that at least two commands identified by the at least two command icons respectively are executed in an order in which the at least two command icons are selected.
The present invention can provide an information processing device which is capable of executing a plurality of processes with intuitive operability.
The following describes embodiments of the present invention in detail, with reference to the drawings. It should be noted that components used in the following embodiments in common are assigned the same reference numerals and their explanations are not repeated.
The storage unit 20 includes a command management unit 101, a data holding unit 102, and a pattern management unit 110. The storage unit 20 is a DB (Database) storing various kinds of data required for the operations performed by the information processing device 100.
The command management unit 101 holds at least one piece of command information. This command information holds a command in association with at least one parameter. The command (also referred to as the “processing command” hereafter) causes a process, corresponding to one of various functions provided for the information processing device 100, to be executed. The parameter specifies a processing detail of the present command. Moreover, the command information holds a command icon identifying the command and a parameter icon identifying the parameter.
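As a rough sketch of one piece of command information, the following Python dataclasses, with hypothetical field names and icon file names, illustrate how a command, its associated parameters, and their identifying icons might be held together.

    # Hypothetical layout of one piece of command information held by
    # the command management unit 101: a command, the parameters
    # associated with it, and the icons identifying each of them.
    from dataclasses import dataclass, field

    @dataclass
    class Parameter:
        value: str           # processing detail, e.g. a target size
        icon: str            # parameter icon identifying the parameter

    @dataclass
    class CommandInfo:
        command: str                     # process executed on an object
        icon: str                        # command icon identifying the command
        parameters: list = field(default_factory=list)

    resize_info = CommandInfo(
        command="resize",
        icon="F1.png",
        parameters=[Parameter("640x480", "D1.png"),
                    Parameter("320x240", "D2.png")],
    )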
The data holding unit 102 holds an object which is to be an operation target, in association with an object icon identifying the present object. It should be noted that the object described in the first embodiment refers to the substantial data that is held by the information processing device 100 typically in the form of files. Specific examples of the substantial data are not particularly limited and may include data in all kinds of formats, such as an image file, a video file, and a text file.
The pattern management unit 110 holds at least one pattern management table in which a group of commands and parameters specified by an operation instruction is registered as a processing pattern. That is to say, the pattern management table includes processing sequence data (namely, the processing pattern).
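By way of a small sketch, assuming illustrative field names, a pattern management table entry might pair an object attribute with the processing sequence data registered for it, so that only the patterns executable on the selected object can later be extracted.

    # Hypothetical pattern management table: each processing pattern
    # records the object attribute it applies to and its processing
    # sequence data (commands and parameters in execution order).
    PATTERN_TABLE = {
        "R1": {
            "applies_to": "image",
            "sequence": [("resize", "640x480"), ("retouch", "sepia")],
        },
    }

    def patterns_for(object_attribute):
        """Extract the processing patterns executable on the selected object."""
        return [pattern_id for pattern_id, pattern in PATTERN_TABLE.items()
                if pattern["applies_to"] == object_attribute]

    patterns_for("image")  # -> ["R1"]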
The display unit 103 displays an operation screen (a graphical user interface) that includes icons representing objects, commands, and parameters. The user selects an operation target object by choosing from among the object icons displayed on the display unit 103. Also, the user selects a command icon and a parameter icon corresponding to a process desired to be performed on the selected operation target object. As a result, the process desired to be performed on the operation target object can be executed.
It should be noted that a specific example of the display unit 103 is not particularly limited. For example, a liquid crystal display, a plasma display, or an organic EL (Electroluminescence) display can be adopted. Or, the information processing device 100 may not include the display unit 103 as a component, and may perform display control over an external display screen.
The instruction receiving unit 104 receives a user operation performed on an icon displayed on the display unit 103. To be more specific, when the user selects an icon displayed on the display unit 103, the instruction receiving unit 104 sends position information of the selected icon, as coordinate information, to an operation determination unit 105. Together with this coordinate information, identification information of the selected icon is also sent to the operation determination unit 105. It should be noted that the input device is, for example, a touch-panel input device placed over the display unit 103 implemented on the information processing device 100, or a mouse or trackball allowing cursor manipulation.
The display control unit 10 controls a displaying status of the display unit 103 according to the user operation received by the instruction receiving unit 104. More specifically, the display control unit 10 first causes at least one object icon to be displayed in a first display area of the display unit 103.
Next, when the instruction receiving unit 104 receives an operation in which at least one of at least one object icon displayed in the first display area is selected, the display control unit 10 causes at least one command icon to be displayed in a second display area that is different from the first display area. Note that the display control unit 10 extracts, from the command management unit 101, only a command executable on the operation target object identified by the selected object icon, and then causes a command icon identifying the extracted command to be displayed. Here, a detail displayed in the first display area of the display unit 103 is maintained as it is.
Then, when the instruction receiving unit 104 receives an operation in which at least one of at least one command icon displayed in the second display area is selected, the display control unit 10 causes at least one parameter icon to be displayed in a third display area that is different from the first and second display areas. Note that the display control unit 10 extracts, from the command management unit 101, only a parameter associated with the command identified by the selected command icon, and then causes a parameter icon identifying the extracted parameter to be displayed. Here, details displayed in the first and second display areas on the display screen are maintained as they are.
Moreover, the display control unit 10 extracts a processing pattern executable on the operation target object, from the pattern management unit 110. Then, the display control unit 10 causes a processing pattern icon identifying the extracted processing pattern to be displayed in a pattern display area that is different from the first, second, and third display areas of the display unit 103.
In order to execute the above process, the display control unit 10 includes the operation determination unit 105, an attribute verification unit 108, and an operation area management unit 109.
The operation determination unit 105 determines, for example, whether or not coordinates indicated by the coordinate information received from the instruction receiving unit 104 overlap with a command icon representing the corresponding command. More specifically, the operation determination unit 105 determines whether or not the coordinates indicated by the coordinate information received from the instruction receiving unit 104 coincide with the position of the command icon.
This determination is repeated at predetermined time intervals. Then, the operation determination unit 105 sends identification information of the command icon located at a position that is specified by the coordinates indicated by the coordinate information received from the instruction receiving unit 104, to the command management unit 101. The operation determination unit 105 performs likewise when the object icon or the parameter icon is selected.
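The overlap determination itself can be pictured as a simple bounds check repeated at the predetermined intervals; the following sketch, with assumed icon geometry and identifiers, is illustrative only and not the disclosed implementation.

    # Sketch of the overlap check by the operation determination unit
    # 105: do the drag coordinates fall within an icon's bounds?
    # Icon geometry and identifiers here are assumptions.

    def hits_icon(x, y, icon):
        left, top, width, height = icon["bounds"]
        return left <= x < left + width and top <= y < top + height

    def icon_under_coordinates(x, y, icons):
        """Called repeatedly at predetermined time intervals during a drag."""
        for icon in icons:
            if hits_icon(x, y, icon):
                return icon["id"]  # identification information sent onward
        return None

    icons = [{"id": "F1", "bounds": (100, 40, 64, 64)}]
    icon_under_coordinates(120, 60, icons)  # -> "F1"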
The attribute verification unit 108 verifies attributes associated with the identification information of the operation target object and the identification information of the processing command that are received from the command management unit 101. Then, the attribute verification unit 108 determines the details to be next displayed on the display unit 103. For example, according to the attribute of the operation target object, the attribute verification unit 108 extracts the processing command executable on the present operation target object from the command management unit 101. Also, according to the attribute of the selected processing command, the attribute verification unit 108 extracts a parameter associated with the present processing command from the command management unit 101.
The operation area management unit 109 defines, as a group, command icons that represent commands having a common attribute. Moreover, the operation area management unit 109 manages the icon display areas of the display unit 103.
The timer 107 measures the period of time elapsed from when the operation determination unit 105 determines that the coordinates indicated by the coordinate information received from the instruction receiving unit 104 fall within the icon to when the operation determination unit 105 determines that the coordinates have moved away from the icon. This elapsed period of time is managed for each icon.
When the duration of stay measured by the timer 107 becomes equal to or longer than a predetermined threshold, the operation determination unit 105 determines that the icon for which the present duration of stay has been measured is selected. Then, the operation determination unit 105 sends the identification information of the selected icon to the attribute verification unit 108 and the instruction generation unit 115. Until receiving an execute instruction (such as execute, pattern save, and cancel) from the user, the operation determination unit 105 sends the identification information pieces of the selected icons one by one, in the order in which their durations of stay become equal to or longer than the threshold. Processing sequence data including the identification information pieces of the icons stored in the instruction generation unit 115 is sent to the instruction execution unit 106 when an execute instruction is received from the user (that is, when the icon representing the execute instruction is selected).
In this way, the operation determination unit 105 and the timer 107 function as duration-of-stay management units that manage the duration of stay measured from when the dragged object icon overlays a command icon or a parameter icon to when the dragged object icon moves away from the command icon or the parameter icon.
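The duration-of-stay management can be sketched as follows in Python; the threshold value and the class structure are assumptions made for illustration, standing in for the cooperation of the operation determination unit 105 and the timer 107.

    # Sketch of duration-of-stay management: an icon counts as selected
    # once the dragged object icon has stayed over it for at least the
    # predetermined threshold. Timing details are illustrative.
    import time

    THRESHOLD_SECONDS = 1.0   # stands in for the predetermined threshold

    class DwellTracker:
        def __init__(self):
            self.entered_at = {}   # icon id -> time the drag entered it
            self.selected = []     # icon ids in the order selected

        def update(self, hovered_icon_id, now=None):
            now = time.monotonic() if now is None else now
            if hovered_icon_id is not None:
                # Start timing when the drag first overlays the icon.
                self.entered_at.setdefault(hovered_icon_id, now)
            for icon_id in list(self.entered_at):
                if icon_id != hovered_icon_id:
                    del self.entered_at[icon_id]  # drag moved away; reset
            if (hovered_icon_id is not None
                    and hovered_icon_id not in self.selected
                    and now - self.entered_at[hovered_icon_id]
                        >= THRESHOLD_SECONDS):
                self.selected.append(hovered_icon_id)  # selection order kept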
As a specific example of the operation in which a command icon or a parameter icon is selected, an operation of overlaying an object icon on the command icon or the parameter icon (namely, a drag operation) is explained as follows. However, note that the present invention is not limited to this. For example, an icon may be selected by pressing the icon itself (through a click operation or a touch operation).
According to the operation of the user received by the instruction receiving unit 104, the instruction generation unit 115 generates an instruction which is a group of commands and parameters stored in the command management unit 101. To be more specific, the instruction generation unit 115 generates the processing sequence data by arranging the identification information pieces of the icons obtained from the operation determination unit 105 in the order in which the icons were selected. Then, when the execute instruction is received from the user, the instruction generation unit 115 sends the present processing sequence data to the instruction execution unit 106.
The instruction execution unit 106 obtains the identification information of the operation target object and the processing sequence data from the instruction generation unit 115. Then, the instruction execution unit 106 executes the commands, corresponding to the command identification information pieces included in the processing sequence data, on the received operation target object in the order in which the commands are stored as mentioned above. Here, the instruction execution unit 106 receives a command corresponding to the command identification information from the command management unit 101, and executes the process on the operation target object using this command.
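Putting the two units together, the following sketch, with hypothetical command identifiers and functions, shows selections accumulating in order and then being executed on the operation target object when the execute instruction arrives.

    # Sketch of the cooperation between the instruction generation unit
    # 115 and the instruction execution unit 106 (names hypothetical):
    # the processing sequence data preserves selection order, and the
    # commands run on the operation target object in that same order.

    COMMANDS = {
        "F1": lambda obj, p: print(f"resize {obj} to {p}"),
        "F2": lambda obj, p: print(f"retouch {obj} with {p}"),
    }

    class InstructionGenerator:
        def __init__(self):
            self.sequence = []                  # processing sequence data

        def icon_selected(self, command_id, parameter):
            self.sequence.append((command_id, parameter))

        def execute(self, target_object):       # the Execute icon was chosen
            for command_id, parameter in self.sequence:
                COMMANDS[command_id](target_object, parameter)

    generator = InstructionGenerator()
    generator.icon_selected("F1", "640x480")
    generator.icon_selected("F2", "sepia")
    generator.execute("IMG_0001.jpg")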
Next, an operation performed by the information processing device 100 having the above configuration is explained.
It should be noted that object icons which are not being displayed are managed as, for example, data connected on a continuous chain. Moreover, these other object icons which are not being displayed can be displayed by touching the left or right arrow button area shown in
An area 202 (Area-2) displays a function list showing a command icon that represents a command (or, function) executable on the operation target object selected in the area 201. In the area 202, command icons of up to the maximum number (six icons F1 to F6 in the present example) are displayed according to display layout information managed by the operation area management unit 109.
An area 203 (Area-3) displays a property list showing a parameter icon that represents a parameter associated with the process corresponding to the command icon selected in the area 202. In the area 203, parameter icons of up to the maximum number (four icons D1 to D4 in the present example) are displayed according to the display layout information managed by the operation area management unit 109.
An area 204 (Area-4) refers to an area unused for displaying (or, a blank area), aside from the areas 201 to 203 already used for display and an area 205 (Area-5) serving as a reserved area arranged for a process execute operation (that is, an icon represented by Execute) or a process cancel operation (that is, an icon represented by Cancel).
The operation area management unit 109 manages operation area information that includes a displaying status or an area occupancy rate required for display, for each of the areas 201 to 205. Also, the operation area management unit 109 may manage guides such as dotted lines or frames that allow the operation areas to be recognized as separate groups. The guides include L1 to L4 shown in
In the following, an operation performed by the information processing device 100 is explained in detail using
First, as shown in
On the other hand, when the current mode is not the operation-target-object selection mode (N in S101), the instruction receiving unit 104 stands by until a next touch-operation event is detected (S102). When the current mode is not the operation-target-object selection mode, this means, for example, that an operation lock has been applied for reasons such as that the user has not performed any operation on the operation screen for a predetermined period of time.
Next, the operation determination unit 105 determines whether or not the operation instruction notified by the instruction receiving unit 104 is entered through a drag operation (S103). When it is entered through the drag operation (Y in S103), a subsequent process is performed. On the other hand, when the operation instruction is not entered through the drag operation (N in S103), a different process is executed such as a process of simply selecting the operation target object by a single touch or double click or a process of directly executing a functional process connected to the operation target object (S102).
Next, the operation determination unit 105 determines whether the operation target object is already decided (S104). When the object is already decided (Y in S104), a subsequent process is performed. On the other hand, when the operation target object is yet to be decided (N in S104), the operation determination unit 105 determines whether the drag operation detected in S103 was performed on an object icon representing the operation target object (that is, one of the object icons C-1 to C-3 drawn in the area 201 in
On the other hand, when it is determined not to be a start of the drag operation performed on an object icon (N in S105), the selection of the operation target object becomes invalid. In this case too, the instruction receiving unit 104 stands by until a next touch-operation event is detected (S102).
When determining that the drag operation was performed on the object icon (Y in S105), the operation determination unit 105 registers at least one selected object icon (C-2 in the present example) into the operation history table held by the instruction generation unit 115 (S106). More specifically, as shown in
However, as shown in
Moreover, the operation determination unit 105 determines whether the drag operation performed by the user is ended (S107). When the drag operation is not ended (N in S107), a subsequent process is performed. The operation determination unit 105 determines whether the center point of the selected object icon is dragged outside the display area (i.e., the area 201 shown in
When the center point of the object icon crosses the guide showing the boundary of the display area (Y in S108) as shown in
Next, based on the result of the object attribute verification process, the attribute verification unit 108 verifies whether there is an executable command or a registered processing pattern associated with the operation target object represented by the object icon that is currently being selected (S110). When there is an executable command or a registered processing pattern (Y in S110), the display control unit 10 displays a command icon “Fn” representing this command or a processing pattern icon “Rn” representing this processing pattern, on the display unit 103 (see
When the drag operation performed by the user is relatively long and the attribute of the operation target object changes during the drag operation or when another object icon is selected by multi-touch, the display control unit 10 updates the already-displayed command icon “Fn” or processing pattern icon “Rn”.
According to the selected operation target object (C-2), the display unit 103 displays command icons, out of command icons representing the commands shown in
As described thus far, the user can visually designate a command process by performing the drag operation so as to overlay the object icon representing the operation target object onto the command icon representing the command, as shown in
Next, before S112 and S113 are explained, processes performed from S114 are explained. Suppose here that the drag operation is continuously performed by the user on the object icon representing the operation target object as shown in
Also, when there are parameters associated with the selected command icon “Fn”, the display control unit 10 displays parameter icons Dn as shown in
Moreover, suppose here that the drag operation is continuously performed by the user on the object icon representing the operation target object as shown in
Next, as shown in
Then, when the sequential drag operation by the user is ended (Y in S107) as shown in
When it is determined that the reserved area is being displayed (Y in S119), a position of the center point of the object icon on the operation screen at the end of the drag operation is determined (S120). To be more specific, when the center point is located on the execute instruction icon in the reserved area (Y in S120), a subsequent process is performed. On the other hand, when the center point is not located on the execute instruction icon (N in S120) and is on the save instruction icon (Y in S121), a process of S122 is performed.
When the center point is located on neither the execute instruction icon nor the save instruction icon (N in S121), it is determined that, for example, information required to execute the processing command associated with the operation target object currently being selected is insufficient. Accordingly, the operation-target-object selection mode is cancelled and the information processing device 100 enters a standby state (S102).
When it is determined that the drag operation performed on the object icon representing the operation target object is ended on the save instruction icon (Y in S121), the instruction generation unit 115 registers the command and parameters held in the operation management table as the processing pattern “Rn” (S122).
Moreover, suppose that it is determined that the center point of the object icon is located on the execute instruction icon in the reserved area at the end of the drag operation (Y in S120) or that the execution of the process in S122 is ended. In this case, the instruction execution unit 106 executes the process corresponding to the command registered in the operation history table, according to a designated parameter (S123).
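As a hedged sketch of S123, the following Python fragment executes the commands registered in the operation history table, with their parameters, in the order in which the command icons were selected (matching the order-of-selection behavior of the instruction generation unit). The table layout and the command callables are illustrative assumptions; the Resize/Mail values are patterned on the example used in the third embodiment.

    operation_history = [
        # (command id, parameters), in order of selection during the drag
        ("Resize", {"size": "QCIF"}),
        ("Mail",   {"to": "User C"}),
    ]

    COMMANDS = {
        "Resize": lambda target, size: print(f"resizing {target} to {size}"),
        "Mail":   lambda target, to: print(f"mailing {target} to {to}"),
    }

    def execute_instruction(target_object: str) -> None:
        # the instruction execution unit walks the table entry by entry
        for command_id, params in operation_history:
            COMMANDS[command_id](target_object, **params)

    execute_instruction("C-2")  # commands run in selection order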
Here, when “Progress view mode” in which the processing progress can be visually confirmed is set in advance in a setting screen or the like of the information processing device 100 (Y in S124), a process execution status is displayed on the display unit 103 (S125). It should be noted that S125 is explained later using a specific example.
On the other hand, when “Progress view mode” is not set (N in S124), the screen of the display unit 103 does not change.
Moreover, in the case where a clear button used for instructing an interruption or cancellation of the process is provided on a part of the body of the information processing device 100, the process currently being executed is cancelled when this clear button is pressed. More specifically, when the user presses the clear button provided on the body (Y in S126), an event notification indicating a cancel instruction is generated in the information processing device 100. Then, the instruction execution unit 106 receives this notification indicating the cancel instruction, interrupts the process currently being executed, and executes a process (namely, a recovery process) to recover the process interrupted midstream (S127).
On the other hand, when the execution of the sequential processes registered in the operation history table is ended without a cancel instruction being issued (N in S126), the instruction execution unit 106 determines that the process execution has completed normally and thus performs an end process (S128). Then, in order to receive a next operation from the user, the information processing device 100 enters the standby state (S102). It should be noted that the end process refers to, for example, a process of deleting the registered details in the operation history table.
In this way, through the processes shown in
Next, the object attribute verification process (S109) is explained in detail, using
As described above, when the center point of the object icon crosses the guide showing the boundary of the display area (Y in S108) as shown in
In response to the request from the operation determination unit 105, the attribute verification unit 108 initializes an object counter used for counting the number of objects which are currently operation targets and a profile counter used for counting the number of profiles associated with the object (that is, each counter is set to 0) (S201). Moreover, as an initialization process, the attribute verification unit 108 clears all entries in the command candidate table. It should be noted that the operation target object is data or a command that is designated as a process target by an operation instruction.
Next, the attribute verification unit 108 obtains the identification information (C-2) of the operation target object with reference to an operation target object “On” registered in the operation history table shown in
In this example, only the object identified by C-2 is selected as the operation target object. On this account, the profile of the operation target object is only the profile "Image" referenced by C-2. On the other hand, suppose that a plurality of object icons are selected using multi-touch. In such a case, a plurality of profiles may be selected from the identification information pieces indicating the plurality of operation target objects. Thus, the attribute verification unit 108 obtains all the profiles (S202).
Next, the attribute verification unit 108 inquires of the command management unit 101 whether there are executable processing commands associated with the obtained profile names. In response to the request from the attribute verification unit 108, the command management unit 101 sends a group of executable commands with reference to the command management table shown in
Then, when there are processing commands associated with the profile name (Y in S204), the attribute verification unit 108 extracts the processing commands associated with the profile and attempts to store the commands into the command candidate table (S205). Here, suppose that the extracted processing commands are already registered in the command candidate table and also “Filter display mode” in which only shared common functions (processing commands) can be visually confirmed is set in advance in the setting screen of the information processing device 100 (Y in S206). In such a case, only the common processing commands are registered in the command candidate table (S207). Accordingly, even when a plurality of operation target objects are selected, only the command icons representing the common processing commands are displayed in a menu. This can prevent a complicated and redundant menu from being displayed.
Moreover, when the number of processing commands associated with the profile is smaller than the maximum number of commands storable in the command candidate table (ten commands in the present example) (N in S208), the attribute verification unit 108 registers the extracted processing commands as the processing candidates directly into the command candidate table (S209). On the other hand, when the number of processing commands associated with the profile is equal to or larger than the maximum number of commands storable in the command candidate table (Y in S208), the attribute verification unit 108 registers commands up to this maximum number, in descending order of frequency of use, on the basis of the command frequencies indicated by "Rate" in
It should be noted that a column indicated by “Com.” in
When it is determined that there are no processing commands associated with the obtained profile in the command management table (N in S204), the attribute verification unit 108 performs the processes from S209 or from S210.
Next, the attribute verification unit 108 verifies whether the processing commands associated with all the profiles of the operation target object currently being verified are extracted (S211). When not all the profiles are verified (N in S211), the attribute verification unit 108 increments the profile counter (S212) and then returns to S203.
On the other hand, when it is determined that all the profiles corresponding to the selected operation target object are verified (Y in S211), the attribute verification unit 108 initializes the profile counter to 0 (S213).
Furthermore, the attribute verification unit 108 checks whether all the operation target objects other than the operation target object currently being verified have been verified (S214). When not all the objects are verified (N in S214), the attribute verification unit 108 increments the object counter (S215) and then returns to S202.
On the other hand, when determining that all the operation target objects have been verified (Y in S214), the attribute verification unit 108 terminates the object attribute verification process and then returns to S110 in
The object attribute verification process described above allows the processing commands to be dynamically extracted corresponding to the profile of the operation target object. Accordingly, a processing command area (namely, the menu) can be dynamically changed.
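To make the loop structure of S201 to S215 concrete, here is a minimal Python sketch, under the stated assumptions that the candidate table holds at most ten commands and that "Filter display mode" keeps only commands common to the objects verified so far. The command management table contents and all names are hypothetical.

    MAX_CANDIDATES = 10

    # Hypothetical command management table: profile -> [(command, "Rate"), ...]
    command_management = {
        "Image": [("Resize", 52), ("Mail", 41), ("Print", 17)],
        "AV":    [("Power", 60), ("Volume", 33)],
    }

    def verify_attributes(selected_objects, profiles_of, filter_display_mode=False):
        """Builds the command candidate table for the selected objects.
        selected_objects: identification info such as ["C-2"];
        profiles_of: maps an object id to its list of profile names."""
        candidates = []                                 # cleared in S201
        for obj in selected_objects:                    # object counter loop (S214/S215)
            for profile in profiles_of(obj):            # profile counter loop (S211/S212)
                extracted = command_management.get(profile, [])   # inquiry (S203/S204)
                if filter_display_mode and candidates:
                    # keep only commands common to the objects verified so far (S207)
                    names = {name for name, _ in extracted}
                    candidates = [c for c in candidates if c[0] in names]
                else:
                    # register extracted commands as candidates (S205/S209)
                    candidates.extend(c for c in extracted if c not in candidates)
        if len(candidates) >= MAX_CANDIDATES:
            # register at most ten commands in descending order of "Rate" (S210)
            candidates = sorted(candidates, key=lambda c: c[1], reverse=True)[:MAX_CANDIDATES]
        return [name for name, _ in candidates]

    print(verify_attributes(["C-2"], lambda obj: ["Image"]))  # -> ['Resize', 'Mail', 'Print']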
Next, the operation to register a processing pattern and the process execution using the registered processing pattern are explained, with reference to
When it is determined that the drag operation is ended on the save instruction icon (Y in S121), the operation determination unit 105 requests the pattern management unit 110 to register the processing sequence data registered in the operation history table, as “Processing pattern Rn” (S122). The pattern management unit 110 newly creates a processing pattern with reference to the designated operation history table and then saves the processing pattern into the pattern management table. In the present example, the pattern management unit 110 interprets the operation history table shown in
Here, the information processing device 100 abstracts the identification information (C-2) of the operation target object and replaces the identification information with the type information (Image) of the operation target object. Thus, when this processing pattern is executed in the future, whether this processing pattern is executable can be determined by checking the type information.
As shown in
With this, when selecting a process in the future, the user can drag the object icon representing the operation target object onto the processing pattern icon representing the registered processing pattern. As a result, the registered processing pattern is executed (this corresponds to the processes from S120).
Note that the processing pattern represented by the pattern ID “R2” in
On the other hand, when determining that the object icon did not pass through the area within the predetermined period of time (N in S112), the operation determination unit 105 performs the processes from S114. In this way, it is made clear to the user that command icons in an operation area through which the object icon merely passed within a short period of time during the drag operation are not actually selected and therefore are not treated as selection candidates. Also, the user can be prevented from touching, and thus selecting, an undesired processing command by mistake during the drag operation.
The present example describes the case where the determination regarding the transit is made based on a fixed predetermined period of time. However, the size of the operation area is not always the same, so the time required to drag the selected icon through a transit area varies accordingly. Thus, the predetermined period of time may be changed dynamically according to the size of the entire screen of the display unit 103, the maximum displayable number of processing commands that can be stored in the operation area managed by the operation area management unit 109, or the size of the operation area included in the layout information.
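One way such a dynamic threshold could be computed is sketched below in Python. The base time and weighting constants are invented for illustration and are not taken from the embodiment; only the input factors follow the text above.

    def transit_threshold_ms(area_width_px: int, screen_width_px: int,
                             max_displayable_commands: int) -> int:
        """Scales a base threshold by the fraction of the screen the operation
        area occupies and by how crowded that area can become."""
        base_ms = 300                                    # hypothetical base value
        area_factor = area_width_px / screen_width_px    # larger area -> longer transit
        crowding_factor = 1 + max_displayable_commands / 20
        return int(base_ms * (0.5 + area_factor) * crowding_factor)

    # Example: a 96 px wide command area on a 320 px wide screen, up to 10 commands.
    print(transit_threshold_ms(96, 320, 10))  # -> 360 (ms)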
Moreover, when the processing pattern icon is selected as shown in
Furthermore, when the user selects the processing pattern icon and ends the drag operation on the execute instruction icon (S120) as shown in
According to the first embodiment described thus far, the operation determination unit 105 included in the information processing device 100 detects the operation target object selected on the operation screen by manipulating the cursor, and also detects the processing command to be executed on the selected operation target object. The instruction generation unit 115 generates an operation history table in which an operation target object, a processing command associated with the operation target object, and a parameter associated with the processing command are stored in association with one another. Also, at least one object icon, a command icon identifying a processing command executable on the selected operation target object, and a parameter icon identifying a parameter associated with the processing command are displayed on the same operation screen.
Moreover, the instruction generation unit 115 arranges the identification information pieces of the processing commands detected by the operation determination unit 105 during the drag operation, in the order in which they are detected. This drag operation is performed in a single movement, without releasing the selection, from when the operation target object is selected to when the group of processing commands to be executed on that operation target object is determined. As a result, the processing sequence data is generated. On the basis of the identification information pieces in the processing sequence data, the instruction execution unit 106 executes the processing commands on the selected operation target object.
With this, the information processing device 100 can display only a menu that is available to the selected operation target object and a subordinate menu that is available after the execution of the process corresponding to the menu item selected from the present available menu. Moreover, the information processing device 100 allows a plurality of desired processes to be designated and executed only by tracing on the display menu through a drag operation. In other words, this can implement the information processing device 100 which is capable of executing a plurality of processes with intuitive operability.
The operation determination unit 105 and the timer 107 serving as the duration-of-stay management units manage the duration of stay from when the object icon representing the selected operation target object is partially overlaid on the command icon to when this object icon is moved away from the command icon. Then, only when the duration of stay on this icon exceeds the predetermined threshold, the operation determination unit 105 detects the command represented by the command icon as the selected command.
With this, even when the user touches an undesired menu item for a moment due to an unintentional hand movement or an operational mistake during the drag operation, this menu item is not considered to be selected. This can prevent a menu item that is not desired by the user from being executed.
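The duration-of-stay rule can be sketched as follows in Python: a command counts as selected only if the object icon stays overlaid on its command icon longer than a threshold. The class, the callback names, and the 500 ms threshold are hypothetical; the timer object plays the role of the timer 107.

    import time

    class DwellDetector:
        def __init__(self, threshold_s: float = 0.5):
            self.threshold_s = threshold_s
            self.entered_at = None          # set while the icons overlap

        def on_enter(self) -> None:
            """Object icon starts overlapping the command icon."""
            self.entered_at = time.monotonic()

        def on_leave(self) -> bool:
            """Object icon moves off the icon; True means 'treat as selected'."""
            if self.entered_at is None:
                return False
            stayed = time.monotonic() - self.entered_at
            self.entered_at = None
            return stayed > self.threshold_s

    detector = DwellDetector()
    detector.on_enter()
    time.sleep(0.05)                        # a momentary, unintended touch
    assert detector.on_leave() is False     # too short: not selected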
The pattern management unit 110 manages the processing sequence data generated in the past, as the processing pattern. With this, when the same processes are to be executed, the corresponding commands do not need to be selected. This results in improved convenience to the user.
It should be noted that even when there is no duration between the process designation and the process execution, the display control unit 10 may display the processing sequence data on the screen in order for the user to check whether there is no mistake in the displayed processing sequence data, and the process may be executed after the execute instruction is received from the user. This can prevent a process which is not desired by the user from being executed. Also, this can prevent degradation from occurring to the operational performance (or, response) in the continuous drag operation. Here, the degradation may occur due to an increased load on the terminal that executes a background process to achieve immediate execution.
Moreover, when two operation target objects are selected at the same time, the display control unit 10 displays only the commands common to both of these operation target objects. Note that the display control unit 10 performs likewise when more than two operation target objects are selected at the same time.
Since the menu items which cannot be executed on the two operation target objects at the same time are not displayed, the user can be prevented from making, for example, an operational mistake. This results in improved convenience to the user.
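A minimal sketch of this common-command behavior follows: only commands executable on every selected object stay active, and the rest are deactivated (grayed out). The command sets are hypothetical inputs, patterned on the air conditioner and TV example that appears in the second embodiment.

    def split_common(commands_per_object):
        """Returns (shown, grayed) given one command set per selected object."""
        shown = set.intersection(*commands_per_object)
        grayed = set.union(*commands_per_object) - shown
        return shown, grayed

    aircon_commands = {"Power", "Timer"}
    tv_commands = {"Power", "Timer", "Volume", "Mute", "Channel Up", "Channel Down"}
    shown, grayed = split_common([aircon_commands, tv_commands])
    print(sorted(shown))   # ['Power', 'Timer'] remain selectable
    print(sorted(grayed))  # the TV-only commands are deactivated (grayed out)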
The operation area management unit 109 manages group areas in which operation target objects, commands, and parameters are separately arranged as groups respectively, and displays boundaries between adjacent areas as guides. Thus, the user can intuitively determine whether the menu includes a common attribute, meaning that convenience to the user can be improved.
It should be noted that the operation determination unit 105 may determine the aforementioned predetermined threshold according to the size or width of a group area. With this, for the individual information processing devices 100 each having a different screen size, a flexible threshold can be set according to a moving distance in a required drag operation, in consideration with the sizes of the menu display areas that vary depending on the screen size. Therefore, operational mistakes can be reduced, and the operability can be improved.
The second embodiment relates to a system which executes a process in cooperation with external devices connected to an information processing device via a network.
When the command management table held by the command management unit 101 does not store a profile of the external device which is the operation target object, the attribute verification unit 108 obtains the profile from the external device. Here, the profile is obtained via the communication unit 111. It should be noted that, instead of attempting to obtain information on the profile level, the display control unit 10 may obtain, from the external device, the command management table or function list that includes the profile.
The instruction generation unit 115 sends the generated processing sequence data together with a process execution request, to the external device which is the operation target object. Also, the instruction generation unit 115 sends the processing pattern held by the pattern management unit 110 together with the process execution request, to the external device.
The process execution request is sent via the communication unit 111. Note that the external device can exchange information with the information processing device 100A by communication via the network 250, using a standard device-cooperation protocol such as UPnP (Universal Plug and Play) or ECHONET (Energy Conservation and Homecare Network) or using a uniquely designated protocol.
Next, an operation of the system having the configuration described above is explained. The explanation is given, based on the following premise. That is, the user operates the first TV 400 capable of receiving digital terrestrial broadcasting (D) and satellite broadcasting (BS) and the second TV 401 (a set-top box) capable of receiving cablecast (CATV), using the information processing device 100A.
The diagrams and explanations given below describe a case in particular where Channel 8 of digital terrestrial broadcasting is selected in a tuner of the first TV 400, Channel 102 of CATV is selected in a tuner of the second TV 401, and these two received pictures are displayed on the first TV 400 in a dual split-screen mode in a ratio of three to one.
In
Each object icon (also referred to as a "device icon" when the icon represents an external device) displayed in Area-1 represents an external device. In the first embodiment, the contents are the operation target objects, whereas in the second embodiment the external devices are the operation target objects; the targets of operation thus differ. Regardless of this difference, the contents and the external devices can be processed as operation target objects in the same manner.
More specifically, by touching the device icons (the TV1 and the TV2 in the present example) on the operation screen and performing a drag operation, the user can select and then execute the processing command associated with the operation target objects according to the flowchart shown in
In response to the request from the attribute verification unit 108, the command management unit 101 sends a group of executable commands with reference to the command management table shown in
Therefore, it is determined that the profile of “TV” does not exist in the command management unit 101 (N in S204).
Here, the attribute verification unit 108 inquires of the external device, via the communication unit 111, whether there is (namely, whether the external device holds) command information corresponding to the profile of “TV” (S216). Then, the attribute verification unit 108 receives a result of the inquiry via the communication unit 111, and registers the received command information into the command management unit 101.
The attribute verification unit 108 determines whether or not the command information corresponding to the desired profile has been obtained (S217). When the command information has been obtained (Y in S217), the attribute verification unit 108 performs the processes from S205. On the other hand, when the command information corresponding to the desired profile has not been obtained (N in S217), the attribute verification unit 108 performs the processes from S209 or from S210.
Information 1502 to 1504 describe functions or services provided by the device. At the beginning and end of the information 1502 to 1504, device service tags (Device Services) are described. In other words, it is understood that the section between the device service tags describes the information regarding the functions or services provided by the device. In the present example, there are three elements as the functions and services provided by the device.
The information 1502 describes that: the TV1 has a tuner; the TV1 is capable of receiving digital terrestrial broadcasting; the ID identifying the function is “TV1-T1”; and the channel currently being set to receive in the tuner is “D-081”. The information 1503 describes that: the TV1 has a tuner capable of receiving BS broadcasting; the ID identifying the function is “TV1-T2”; and the channel currently being set is “BS-05”. The information 1504 describes that: the TV1 has an EPG (Electronic Program Guide) function; the ID identifying the function is “TV1-EPG”; and the current state is “Ready (Standby)”.
Information 1505 to 1507 describe information regarding commands that correspond to the functions or services provided by the device. At the beginning and end of the information 1505 to 1507, control list tags (Control List) are described. In other words, it is understood that the section between the control list tags describes the information regarding the processing commands that correspond to the functions or services provided by the device. In the present example, there are three elements as the processing commands that correspond to the functions and services provided by the device.
The information 1505 describes that channel change in ascending order (Channel Up) is provided as a command (action) executable on a target tuner function. To be more specific, the information 1505 describes that "Button: Up" is recommended as a form for displaying this command, and also describes a path required to actually execute the processing command as a URL (Uniform Resource Locator) of the target external device.
The information 1506 describes that channel change in descending order (Channel Down) is provided. To be more specific, the information 1506 describes that “Button: Down” is recommended as a form for displaying this command, and also describes a path required to execute the command. It should be noted that the tuner which is the target here is “Tuner” represented by the function type (Type tag) held by the TV1. Also note that the channel change is applicable to each of the tuners for receiving digital terrestrial broadcasting and BS broadcasting.
The information 1507 does not designate the target, and describes that there is a split-screen display (Dual Display) as a command that corresponds to a basic function of the TV1. To be more specific, the information 1507 describes that “(typical) Button” is recommended as a form and that the property to be displayed on this button is a split rate list (Rate List). Also, a URL required to obtain XML of the split rate list is described.
Moreover, a path required to actually execute the processing command is described as a URL of the target external device. Furthermore, the information 1507 describes how to designate the input sources for the two screens necessary for the dual display and how to designate the split rate.
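For illustration, a device description of this kind could be read as follows with Python's standard XML parser. The exact element names (DeviceServices, Service, ControlList, Command), the attribute layout, and the URLs are assumptions approximating the tags described above, not the actual format of the embodiment.

    import xml.etree.ElementTree as ET

    description = """
    <Device>
      <DeviceServices>
        <Service><Type>Tuner</Type><Id>TV1-T1</Id><Channel>D-081</Channel></Service>
        <Service><Type>Tuner</Type><Id>TV1-T2</Id><Channel>BS-05</Channel></Service>
        <Service><Type>EPG</Type><Id>TV1-EPG</Id><State>Ready</State></Service>
      </DeviceServices>
      <ControlList>
        <Command name="ChannelUp" form="Button:Up" target="Tuner"
                 url="http://tv1.local/control/channel_up"/>
        <Command name="ChannelDown" form="Button:Down" target="Tuner"
                 url="http://tv1.local/control/channel_down"/>
        <Command name="DualDisplay" form="Button"
                 url="http://tv1.local/control/dual_display"/>
      </ControlList>
    </Device>
    """

    root = ET.fromstring(description)
    # one entry per function/service between the device service tags
    services = [(s.findtext("Type"), s.findtext("Id")) for s in root.iter("Service")]
    # one entry per processing command between the control list tags
    commands = [(c.get("name"), c.get("url")) for c in root.iter("Command")]
    print(services)  # [('Tuner', 'TV1-T1'), ('Tuner', 'TV1-T2'), ('EPG', 'TV1-EPG')]
    print(commands)  # command names with the URL paths used to execute them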
In
By pressing (or touching) these up and down buttons, the user can change the channels of the tuners. Also, in
In
In
In the present example, out of the functions of the air conditioner (Profile: Appliance) and the TV (Profile: AV, TV), the processing commands “Volume”, “Mute”, “Channel UP”, and “Channel Down” which are not common to these operation target objects are deactivated (namely, grayed out).
A column indicated by “Com.” in
Since “Power” and “Timer” represent the commands which are executable on and common to the selected two devices, these commands are determined as common functions when a plurality of devices are selected using, for example, multi-touch. Accordingly, the attribute verification unit 108 activates, for display, the icons “Power” and “Timer” determined as the common functions executable on the selected operation target objects.
Here, in
In the command management table shown in
The command reference tables shown
Here, the state information of the operation-target air conditioner and TV is described in the reference table by using, for example, the power state tags (Power State) shown in the information 1501 in
Thus, on the basis of the operating state of the operation target object and the description in the command reference table, the attribute verification unit 108 causes the command icon representing the processing command “Power” to be displayed as “Power OFF”. In this way, according to the state of the operation target device, the command icons representing the inexecutable processing commands are not displayed, and only the command icons representing the executable processing commands are thus displayed. This can improve the efficiency of the user operation.
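As a minimal sketch of this state-dependent display, the Python fragment below derives the "Power" icon label from the device's reported power state and filters out commands that are not executable in the current state. The reference table contents and the state strings are assumptions based on the description above.

    def power_icon_label(power_state: str) -> str:
        """Maps the device's reported power state to the icon label to display."""
        return "Power OFF" if power_state == "ON" else "Power ON"

    def executable_commands(reference_table: dict, state: str) -> list:
        """Keeps only commands whose reference table lists the current state
        as one in which the command is executable."""
        return [cmd for cmd, states in reference_table.items() if state in states]

    reference_table = {
        "Power":  {"ON", "OFF"},   # toggling power is always executable
        "Volume": {"ON"},          # volume only makes sense while the device is on
        "Timer":  {"ON", "OFF"},
    }
    print(power_icon_label("ON"))                       # -> Power OFF
    print(executable_commands(reference_table, "OFF"))  # Volume is suppressed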
Moreover, since the unnecessary command icons can be prevented from being displayed, the command icons can be displayed without using a hierarchical structure. With this, as shown in
Note that, as the parameter icons in
When the user performs the operation in accordance with the procedure shown in
In the case where the progress view mode is set (Y in S124 in
Next, the process execution using the registered processing pattern is explained. The explanation is given in particular about a case, as an example, where detailed properties are set for a command execution time and for a command execution.
To be more specific, the user selects a desired processing pattern (Morning Bell), as the operation target object, from the list of the registered processing patterns displayed in an area corresponding to the area 201 in
Then, by the drag operation, the selected object icon: crosses the command icon representing “Daily” in Recurring Setting displayed in an area corresponding to the area 202 in
In this way, a command whose setting is to be changed is also included as a command associated with the processing pattern, so that a registered processing pattern including a plurality of commands can later be changed or modified. Here, the processing pattern can be changed or modified through a drag operation of the user.
It should be noted that, in
According to the second embodiment as described thus far, at least one of the plurality of operation target objects is an external device of the information processing device 100A. The information processing device 100A has the communication unit 111 which communicates with the external device. When the external device is selected as the operation target object, the attribute verification unit 108 sends, via the communication unit 111, a message requesting the selected external device to send the corresponding processing command.
With this, the external device can be handled in the same manner as the operation target objects (contents) held by the information processing device 100A.
When the processing sequence data includes the identification information of the processing command to be executed by the external device, the instruction generation unit 115 sends this identification information to the external device via the communication unit 111.
The display correction unit 112 corrects the displayed details according to the area on the display screen that is not yet used for displaying the operation target objects, the processing commands, and the like (namely, the blank area). To be more specific, the display correction unit 112 corrects the spaces between a plurality of icons displayed on the display unit 103, according to the size of the area that remains unused for displaying icons on the display unit 103.
In the present example, the explanation is given based on the following premise. That is, the information processing device 100B can display icons representing external devices, namely, a hard disk recorder (PVR), a television (TV), and an electric pot (Pot), which are connected via a network 250.
Also, the explanation is based on the following premise. The user captures a recipe from a cooking program currently being shown on the TV as a snapshot (here, suppose that the TV is in a dual screen mode, and that the recipe is displayed on the left-hand screen (Tuner 1) out of the two screens). Then, when sending an email message with the captured image attached, the user resizes the image to QCIF size and selects “User C” as a destination.
The display control unit 10 causes the display unit 103 to display an operation screen as shown in the top portion of
A table shown in the bottom portion of
The following is a description about the process of correcting the displayed details, with reference to
The display correction unit 112 corrects each size of the display areas currently being displayed on the display unit 103 while the instruction receiving unit 104 is receiving an operation instruction from the user, so as to enhance ease-of-use for the user. The display correction unit 112 monitors the aforementioned occupation rate for each display area, and verifies the current size of the area unused for displaying (S301). This verification is performed when, for example, the user selects a command from the command area.
For example, when the user selects a detailed property (QCIF) of Resize for the image as shown in
Here, the display correction unit 112 determines whether the current occupation rate of the area unused for displaying is equal to or smaller than a predetermined threshold (in the present example, 20% as a fixed value) (S302). When the occupation rate is larger than the threshold (N in S302), the display correction unit 112 terminates the displayed-detail correction process.
On the other hand, when the occupation rate is equal to or smaller than the threshold (Y in S302), the display correction unit 112 determines whether a selected item (i.e., a part visible to the user that is displayed in the processing command area or the subcommand area) exists in the operation history table (S303). When the selected item does not exist (N in S303), the display correction unit 112 terminates the displayed-detail correction process. On the other hand, when the selected item exists (Y in S303), the display correction unit 112 clears the display of the bordered area (with guides indicated by dotted lines in
Moreover, when the display correction level of the area that corresponds to the area 2302a (Functions) is "0" (Y in S305), the display correction unit 112 sets each display margin above and below the command icons or parameter icons at "0 (pixel)" (S306).
Then, the display correction unit 112 changes the display correction level of the display-corrected area 2302a to “1” (S307), revises the size and occupation rate of the area unused for displaying (S313), and then refreshes the operation screen according to the changed setting (S314). As a result of this, an area 2302b shows the compressed representation of the area for the selected items.
To be more specific, as shown in the top portion of
On the other hand, when the display correction level is not “0” (N in S305), the display correction unit 112 determines whether the display correction level is “1” (S308). When the display correction level is not “1” (N in S308), the display correction unit 112 determines that the display correction is impossible and thus terminates the displayed-detail correction process.
The top portion of
Here, the occupation rate (18% in the present example) of an area-unused-for-displaying 2304c in
For example, the display correction unit 112 sets each display margin above and below the command icons or parameter icons at “−10 (pixels)” (S309). As a result, as shown in the top portion of
Then, the display correction unit 112 changes the display correction level of the display-corrected area 2302c to “2” (S312) and performs the above-described processes from S313. As a result, an area 2302d of
According to the third embodiment described thus far, when new icons are displayed, the display correction unit 112 of the information processing device 100B corrects the display form of the icons currently being displayed to a compressed form, according to the size of the area that remains unused for displaying on the display screen.
With this, even on a narrow screen, a menu from which a next selection is to be made by a drag operation can be displayed. Thus, the drag operation can be performed continuously, thereby improving the operability.
The compressed form refers to a form in which the displayed icons are reduced in size or a form in which the displayed icons partially overlap one another.
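The stepwise correction of S301 to S314 can be sketched as follows in Python: when the blank area falls to 20% or below, the margins in the selected-item area are compressed one level at a time. The 0-pixel and minus-10-pixel values and the 20% threshold follow the description above; the level-0 default margin and the screen model itself are hypothetical.

    THRESHOLD = 0.20                          # fixed occupation-rate threshold (S302)
    MARGINS_BY_LEVEL = {0: 4, 1: 0, 2: -10}   # level 2 lets icons partially overlap

    def correct_display(unused_rate: float, level: int, has_selected_item: bool) -> int:
        """Returns the new correction level (unchanged when no correction applies)."""
        if unused_rate > THRESHOLD or not has_selected_item:
            return level                      # N in S302 / N in S303: no correction
        if level in (0, 1):                   # S305 / S308
            new_level = level + 1
            margin = MARGINS_BY_LEVEL[new_level]
            print(f"set vertical icon margins to {margin} px")   # S306 / S309
            return new_level                  # S307 / S312
        return level                          # level 2: no further correction possible

    level = correct_display(unused_rate=0.18, level=0, has_selected_item=True)      # -> 1
    level = correct_display(unused_rate=0.18, level=level, has_selected_item=True)  # -> 2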
The virtual position determination unit 113 allows an operation to be performed on an external device by imitating a display screen of the external device on the operation screen. To be more specific, the virtual position determination unit 113 receives display layout information indicating details displayed on the display screen of the external device from this operation target object via the communication unit 111. Then, the virtual position determination unit 113 causes the display details indicated by the received display layout information to be displayed on a virtual screen in a virtual screen display area which is different from the first, second, and third display areas of the display unit 103.
When the selected object icon is released inside the virtual screen, for example, the virtual position determination unit 113 detects this release position on the virtual screen. Also, the virtual position determination unit 113 converts release position information corresponding to the virtual screen into position information corresponding to the display screen of the external device. Although the above explanation is based on the premise that the object icon is dragged, the present invention is not limited to this. When a specific position on the virtual screen is selected with a click for example, the selected position on the virtual screen may be detected and also information about this selected position on the virtual screen may be converted into information about a position on the display screen of the external device. The same is applied to a fifth embodiment described later.
Also, the explanation is based on the following premise. The user selects an image (such as a family photograph) saved in the mobile terminal close at hand and has the selected image displayed on an arbitrary position on a screen of the TV. In doing so, the user adds, to the selected image, a message made up of a character string visible to other users (such as “Enjoy!”) and designates an expiration period (a month, for example) after which the display of the image and added message is canceled.
To be more specific, an area 2600 in
Moreover, the size of a presentation area (Full Screen) of the screen actually included in the operation target device (Target Device) is 960 pixels in width and 540 pixels in height. Also, the coordinates (0, 0) on the X and Y axes of the screen are the starting point of this presentation area.
The structural information (i.e., the display layout information) of the virtual screen shown in
Then, by the drag operation, the selected object icon: crosses the message icon (“Enjoy!” in the present example) desired by the user from the message list displayed in an area 2602 (Area-2); crosses an icon representing the expiration period (“1 month” in the present example) desired by the user from properties of the display expiration periods displayed in an area 2603 (Area-3); and is finally released at an arbitrary position in the virtual screen display area (called “Position on message board”) of the operation target device, the virtual screen display area being considered as the execute instruction icon.
Here, the instruction receiving unit 104 of the information processing device 100C obtains drag-operation-end coordinates (104, 368) at which the drag operation is ended in the virtual screen display area. Then, the instruction receiving unit 104 sends information regarding the drag-operation-end coordinates together with the instruction for the drag operation, to the operation determination unit 105. The operation determination unit 105 determines that the drag operation is ended on the execute instruction icon (S120 in
The virtual position determination unit 113 connected to the operation determination unit 105 adjusts the information regarding the coordinates in the virtual screen display area that are obtained at the end of the drag operation, according to the display position or scaling rate of the virtual screen. Then, the virtual position determination unit 113 sends the adjusted coordinate information to the operation determination unit 105. In this coordinate information adjustment process, since the Y coordinate of the starting point in the virtual screen display area is 300, 300 is subtracted from the drag-operation-end coordinate in the direction of the Y-axis. Moreover, in this adjustment process, the coordinates are converted according to the scaling rate.
Accordingly, as the coordinate information after the adjustment process, the coordinates (312, 204) to be used eventually on the screen of the operation target device are obtained. The operation determination unit 105 sends the pre-adjustment coordinate information and the post-adjustment coordinate information, to the instruction generation unit 115. Then, the instruction generation unit 115 sends an operation instruction message with the post-adjustment coordinate information attached, to the operation target device via the communication unit 111.
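A worked sketch of this adjustment in Python follows, using the numbers from the example: the virtual screen display area starts at y = 300 on the operation screen and shows the 960 x 540 device screen at a 1/3 scaling rate, so (104, 368) becomes (312, 204). The function name and parameter defaults are illustrative.

    def to_device_coords(drag_end, area_origin=(0, 300), scale=1 / 3):
        """Converts coordinates in the virtual screen display area into
        coordinates on the external device's own display screen."""
        x, y = drag_end
        ox, oy = area_origin
        # shift into the virtual screen's local coordinates, then undo the scaling
        return round((x - ox) / scale), round((y - oy) / scale)

    assert to_device_coords((104, 368)) == (312, 204)  # matches the example above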
Information 2606 and information 2607 describe additional information regarding the added image. To be more specific, in the information 2606, the message “Enjoy!” selected on the operation screen is designated between text tags (Description). Moreover, in the information 2606, information indicating the display expiration period “1 month” also selected on the operation screen is designated between display expiration tags (Expire).
In the information 2607, the coordinates (312, 204) calculated by the virtual position determination unit 113 for the operation target device are designated between display position tags (Position). When receiving this operation instruction message, the operation target device obtains the image according to the operation instruction and displays the image with the designated message. Also, the operation target device clears the display according to the designated display expiration period.
With the display like this, the positions of the contents currently being displayed can be recognized. Therefore, an operation mistake, such as overlaying new contents on the already-displayed contents and thus hiding these already-displayed contents, can be avoided.
According to the fourth embodiment described thus far, the virtual position determination unit 113 of the information processing device 100C displays, within the operation screen, the virtual screen that corresponds to the display screen of the external device. When the selected object icon is released inside the virtual screen display area, the virtual position determination unit 113 detects the release position on the virtual screen and also converts the release position information corresponding to the virtual screen into the position information corresponding to the display screen of the external device. Moreover, the virtual position determination unit 113 sends the position information together with the identification information of the processing command to be executed by the external device.
With this, a process to be executed by the external device connected via the network, such as pasting memo information on the display screen of the external device, can be designated on the display unit 103 of the information processing device 100C close at hand. Also, a position at which this designated process is to be executed on the display screen of the external device can be designated.
The sensor 114 detects held-state information that indicates a state (typically, the orientation of the display unit 103) of how the information processing device 100D is currently being held. The sensor 114 sends the detected held-state information to the display control unit 10. The sensor 114 may send the held-state information in response to a request from the display control unit 10, or may automatically send the held-state information whenever detecting a change in the orientation of the information processing device 100D. Receiving the held-state information, the display control unit 10 changes the display positions (namely, the layout) of the icons displayed on the display unit 103, according to the orientation of the display unit 103.
Also, the explanation is based on the following premise. The user selects a desired image from a plurality of images displayed on the TV which is the external device. Then, using the information processing device 100D at hand, the user adds a message (“Great!”) to the selected image and sets the display expiration period (“1 month”). After this, the user re-pastes the image at an arbitrary position on the display screen of the TV.
For example, the size of a presentation area (Full Screen) of the screen included in the information processing device 100D (Source Device) is 320 pixels in width and 480 pixels in height. Also, the coordinates (0, 0) on the X and Y axes of the screen are the starting point of the presentation area.
To be more specific, an area 2800 in
Each of an area 2804 (Area-4) in
The size of a presentation area (Full Screen) of the screen actually included in the operation target device (Target Device) is 960 pixels in width and 540 pixels in height. Also, the coordinates (0, 0) on the X and Y axes of the screen are the starting point of this presentation area.
The structural information of the virtual screen shown in
Before the display control unit 10 causes the display unit 103 to present the operation screen, the sensor 114 detects the held-state information (including the orientation, bearing, and tilt against gravity, collectively called the device orientation) of the information processing device 100D in response to a request from the display control unit 10.
Then, the sensor 114 sends the detected current held-state information (the device orientation, in the present example) to the display control unit 10. On the basis of the held-state information received from the sensor 114, the display control unit 10 changes the display mode. In other words, the display control unit 10 determines the held state of the information processing device 100D according to the held-state information received from the sensor 114.
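This handoff between the sensor 114 and the display control unit 10 can be sketched as below in Python. The enum values and the callback wiring are hypothetical; the point is that the display mode follows the reported device orientation while the selection state is preserved.

    from enum import Enum

    class Orientation(Enum):
        VERTICAL = "Vertical"
        HORIZONTAL = "Horizontal"

    class DisplayControl:
        def __init__(self):
            self.mode = Orientation.HORIZONTAL

        def on_held_state(self, orientation: Orientation) -> None:
            """Called by the sensor whenever the device orientation changes."""
            if orientation != self.mode:
                self.mode = orientation
                print(f"switching operation screen to {orientation.value} mode "
                      f"(icon layout is re-arranged, selection state is kept)")

    display = DisplayControl()
    display.on_held_state(Orientation.VERTICAL)  # user turns the terminal upright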
When determining that the current device orientation is “Vertical” as a result of the held-state determination, the display control unit 10 switches the mode of the operation screen to “Vertical mode”. More specifically, the operation screen in “Vertical mode” as shown in
In the case of the operation screen in “Horizontal mode”, the user can, for example, visually check the screen as shown in
Here, when the user turns the mobile terminal during the operation into a vertical position after selecting the image (that is, the user turns the terminal into a position where the aspect ratio of the screen is different), the sensor 114 detects a change in the device orientation of the mobile terminal. Then, as shown in
Then, by a drag operation of the user, the desired content (the image M6 in the present example): crosses the message icon (“Great!” in the present example) desired by the user from the message list displayed in an area 2802 (Area-2); crosses an icon representing the expiration period (“1 month” in the present example) desired by the user from properties of the display expiration periods displayed in an area 2803 (Area-3); and is finally released to end the drag operation at an arbitrary position in the virtual screen display area (namely, “Position on message board” in
In the present embodiment, the explanation has been given about the case where the display mode is switched according to the held-state information. However, note that the present invention is not limited to this. For instance, the display mode may be switched according to external temperature, body temperature, location information, and state information such as the remaining charge of the battery. The state information includes the held-state information as well. For example, a menu that encourages healthcare (such as a menu for setting humidification or temperature of an air conditioner) can be displayed preferentially according to the external temperature or body temperature. Moreover, a map to be displayed during the drag operation can be changed to a neighborhood map according to the location information. Furthermore, according to the remaining charge of the battery, nearby shops that provide battery-charging service can be displayed. The state information other than the held-state information is also detected by the sensor 114.
In the present embodiment, the explanation has been given about the case where the layout of the operation screen is changed when the display mode is switched. However, the present invention is not limited to this. The overall function of the operation menu, or the displayed details and presentation manner of various functions, may be changed according to the orientation of the terminal. For example, the operation screens may be changed following a rule such as operating the device at hand in the vertical mode while the connected device is operated in the horizontal mode.
According to the fifth embodiment described thus far, the sensor 114 of the information processing device 100D detects the state of the information processing device 100D. Then, on the basis of the state information detected by the sensor 114, the display control unit 10 switches the display mode.
With this, the operation screen is switched according to the state of the information processing device 100D, thereby improving the operability of the user.
(1) On each of the operation screens of the information processing devices 100, 100A, 100B, 100C, and 100D in the first to fifth embodiments, the object icons, command icons, and parameter icons which are not being displayed due to the screen layout can be displayed by touching the arrow button areas.
In the second to fifth embodiments, in which the information processing devices 100A, 100B, 100C, and 100D are connected to the external devices, the icons which are not being displayed may be displayed on a large screen included in the external device.
A frame 2910 in
In the frame 2911, the menu items which are hidden in the information processing device 100C or 100D, whose screen is smaller than the screen of the external device, are displayed. This allows the user to visually check the hidden menu items on the screen of the TV which is the external device, while manipulating the information processing device 100C or 100D close at hand. Therefore, when a desired menu item is not displayed on the operation screen of the information processing device 100C or 100D, the user can execute the operation to display the desired menu item (such as "Memo" in the area 2914) checked on the screen of the external device, through the shortest process (for displaying "Memo", fewer screen refreshes are needed when scrolling leftward rather than rightward).
In the present example, the menu items to be selected are text and emoticons (character strings whose combinations of characters visually suggest facial expressions). Note that the menu items may be still images, moving images, and symbols. Although the scroll operation is performed leftward and rightward, the operation may be performed upward and downward, or a 3D representation may be used for presenting the display. Moreover, as shown in the area 2916, hidden picture objects, instead of the menu items, may be displayed supplementarily (for example, the object may be a resident pasting-type application called a widget, including a weather forecast, a clock, and a calendar, or may be a memo or video message pasted on a message board).
The hidden menu items can be displayed on the external device at the following timings. As a first example, in the case where the user is to perform a continuous drag operation, the hidden menu items may be displayed until the drag operation is started. As a second example, the hidden menu items may be displayed whenever the drag operation progresses (such as whenever the selected object icon crosses the boundary between the adjacent areas). As a third example, the hidden menu items may be displayed when the sensor detects a movement, such as a tilt of the information processing device 100D intentionally caused by the user.
(2) In the first to fifth embodiments, the path taken by the object icon in the drag operation is shown by the solid-line arrow on the operation screen displayed on the display unit 103. However, the method of displaying the path is not limited to this. For example, an operational guidance for assisting the operation may be displayed.
(3) In the first to fifth embodiments, whether the object icon and the command icon overlap one another is determined based on whether the center point of the object icon overlaps with the command icon. The base for the determination is not limited to this. For example, the determination may be made based on whether an edge (that is, a border) of the object icon has come into contact with the command icon.
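For illustration, the two overlap tests mentioned here can be written as follows in Python, with both icons modeled as axis-aligned rectangles (x, y, width, height); the geometry and values are hypothetical.

    def center_over(icon, cmd) -> bool:
        """True when the object icon's center point lies inside the command icon."""
        x, y, w, h = icon
        cx, cy = x + w / 2, y + h / 2
        qx, qy, qw, qh = cmd
        return qx <= cx <= qx + qw and qy <= cy <= qy + qh

    def edges_touch(icon, cmd) -> bool:
        """True as soon as the two rectangles intersect at all (border contact)."""
        x, y, w, h = icon
        qx, qy, qw, qh = cmd
        return not (x + w < qx or qx + qw < x or y + h < qy or qy + qh < y)

    icon, cmd = (90, 100, 40, 40), (120, 100, 60, 40)
    print(center_over(icon, cmd))   # False: center (110, 120) is left of the command icon
    print(edges_touch(icon, cmd))   # True: the rectangles already overlap at the edge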
(4) In the second to fifth embodiments, information sent to and received from the external device is abstracted and expressed in XML (Extensible Markup Language) format. The number of elements may be increased or decreased. Also, the description format may be a different format, such as a binary format.
(5) In the first to fifth embodiments, the explanation has been given on the premise that two icons are dragged continuously to the execute instruction area using two fingers. However, after the two icons start moving due to the drag operation, the two icons may be unified and thus the drag operation can be continued even when the user moves one of the fingers off the screen.
(6) Each of the information processing devices 100, 100A, 100B, 100C, and 100D in the first to fifth embodiments has the screen for displaying the operation candidates corresponding to the various kinds of functions (including: editing and processing of a target image or target video; adjustments to image quality, sound quality, and volume of sound in broadcasting and image reproduction; management, entry, reference, and display expiration period setting for the schedule; setting of timer viewing and recording; switching of the recording mode; and creation of mail text and address specification). The information processing device can be applied to a device, such as a mobile terminal, a personal computer, or a car navigation system, that executes, on a standalone basis, a process selected according to an operation instruction via an input device included in the device. Alternatively, the information processing device can be applied to a remote control (hereafter referred to as the remote) of a TV set, a hard disk recorder, an air conditioner, or the like that executes a process selected according to an operation instruction given via the remote provided separately from the device.
Also, to be more specific, each of the information processing devices 100, 100A, 100B, 100C, and 100D in the first to fifth embodiments is a computer system configured by a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and so forth. The RAM or the hard disk unit stores computer programs. The microprocessor operates according to the computer programs, so that the functions of the components included in the computer system are carried out. A computer program includes a plurality of instruction codes indicating instructions to be given to the computer so as to achieve a specific function.
(7) Some or all of the components included in each of the information processing devices 100, 100A, 100B, 100C, and 100D in the first to fifth embodiments may be realized as a single system LSI (Large Scale Integration). The system LSI is a super multifunctional LSI manufactured by integrating a plurality of components onto a single chip. To be more specific, the system LSI is a computer system configured by a microprocessor, a ROM, a RAM, and so forth. The RAM stores computer programs. The microprocessor operates according to the computer programs, so that the functions of the system LSI are carried out.
(8) Some or all of the components included in each of the information processing devices 100, 100A, 100B, 100C, and 100D in the first to fifth embodiments may be implemented as an IC card or a standalone module that can be inserted into and removed from the main body of the information processing device 100, 100A, 100B, 100C, or 100D. The IC card or the module is a computer system configured by a microprocessor, a ROM, a RAM, and so forth. The IC card or the module may include the aforementioned super multifunctional LSI. The microprocessor operates according to the computer programs, so that the functions of the IC card or the module are carried out. The IC card or the module may be tamper resistant.
(9) The first to fifth embodiments describe the example where the present invention is configured by hardware. However, the present invention can be implemented as software.
The information processing device according to the present invention is capable of executing a plurality of processes with intuitive operability, and is useful when applied to a remote control used for operating a device, such as a mobile terminal, a personal computer, a car navigation system, a digital TV set, a digital video disk recorder, a set-top box, a projector, or an external monitor.