A method and system are disclosed for providing a user interface with a graphical user interface (GUI) computer system. The method comprises the steps of receiving a user input command signal, the signal including first and second signals representative of movements of respective first and second user input mechanisms for two-dimensional movements, resolving the first and second signals from the user input command signal, operating a first displayed symbol based on the first signal, and operating a second displayed symbol based on the second signal. The invention is advantageously practiced in an environment in which the user has a plurality of two-dimensional movement input devices, such as a mouse having a joystick-type pointing device as well as the surface contact ball on its underside. A system in accordance with the invention provides the user with many advantageous features, such as the ability to scroll, in the up/down and left/right directions, the content of an image partially displayed by an application window, to move forward/backward through a sequence of frames displayed by an application window, to move a cursor over the GUI display in the up/down and left/right directions, and to move a special function sub-window, such as a magnifier, over the GUI display in the up/down and left/right directions.
1. A method for providing a user interface with a graphical user interface (GUI) computer system, the GUI computer system including first and second user input mechanisms, which produce respective first and second directional movement signals, the GUI computer system further including means for simultaneously displaying first and second movable displayed symbols, the method comprising the steps of:
receiving the first and second directional movement signals for allowing a user to direct two-dimensional movements of respective ones of the first and second displayed symbols; multiplexing the first and second directional movement signals to produce a user input command signal; receiving the user input command signal; resolving the first and second signals from the user input command signal; operating the first displayed symbol based on the first signal; and operating the second displayed symbol based on the second signal, wherein, upon actuation of any of said first and second user input mechanisms, a GUI display window is opened, said window, at any given time, applying to one of said first and second user input mechanisms, such that said window identifies to which of said first and second user input mechanisms the window currently applies, said window allowing switching from said one of said first and second user input mechanisms to the other of said first and second user input mechanisms.
13. A system for providing a user interface with a graphical user interface (GUI) computer system, the GUI computer system including first and second user input mechanisms, which produce respective first and second directional movement signals, the GUI computer system further including means for simultaneously displaying first and second movable displayed symbols, the system comprising:
means for receiving the first and second directional movement signals for allowing a user to direct two-dimensional movements of respective ones of the first and second displayed symbols; means for multiplexing the first and second directional movement signals to produce a user input command signal; means for receiving the user input command signal; means for resolving the first and second signals from the user input command signal; means for operating the first displayed symbol based on the first signal; and means for operating the second displayed symbol based on the second signal, wherein, upon actuation of any of said first and second user input mechanisms, a GUI display window is opened, said window, at any given time, applying to one of said first and second user input mechanisms, such that said window identifies to which of said first and second user input mechanisms the window currently applies, said window allowing switching from said one of said first and second user input mechanisms to the other of said first and second user input mechanisms.
25. A computer program product, for use with a graphical user interface (GUI) computer system, for providing a user interface with the computer system, the computer system including first and second user input mechanisms, which produce respective first and second directional movement signals, the computer system further including means for simultaneously displaying first and second movable displayed symbols, the computer program product comprising:
a computer readable medium; means, provided on the medium, for directing the computer system to receive the first and second directional movement signals for allowing a user to direct two-dimensional movements of respective ones of the first and second displayed symbols; means, provided on the medium, for directing the computer system to multiplex the first and second directional movement signals to produce a user input command signal; means, provided on the medium, for directing the computer system to receive the user input command signal; means, provided on the medium, for directing the computer system to resolve the first and second signals from the user input command signal; means, provided on the medium, for directing the computer system to operate the first displayed symbol based on the first signal; and means, provided on the medium, for directing the computer system to operate the second displayed symbol based on the second signal, wherein, upon actuation of any of said first and second user input mechanisms, a GUI display window is opened, said window, at any given time, applying to one of said first and second user input mechanisms, such that said window identifies to which of said first and second user input mechanisms the window currently applies, said window allowing switching from said one of said first and second user input mechanisms to the other of said first and second user input mechanisms.
2. A method as recited in
extracting the units of information from the user input command signal; and generating the first and second signals from the information units based on the associated ID tags.
3. A method as recited in
processing the first signal as a mouse movement signal; and processing the second signal separately.
4. A method as recited in
(a) for the steps of operating the first and second displayed symbols based, respectively, on the first and second signals, the displayed symbol includes one of: a cursor, an application window or full-screen application (hereinafter an "application window"), and a special function sub-window; and (b) each of the steps of operating includes one of the steps of: scrolling, in the up/down and left/right directions, the content of an image partially displayed by an application window, moving forward/backward through a sequence of frames displayed by an application window, moving a cursor over the GUI display in the up/down and left/right directions, and moving a special function sub-window over the GUI display in the up/down and left/right directions.
5. A method as recited in
positioning a cursor within the application window containing the partly displayed image; and scrolling the content simultaneously in the up/down and left/right directions as per the signal.
6. A method as recited in
7. A method as recited in
mapping click buttons of the user input mechanism to the forward and backward directions of movement through the sequence of frames; and displaying successive earlier or later frames of the sequence of frames responsive to clicks of the respective click buttons.
8. A method as recited in
the application window includes a Web browser; and the step of displaying successive earlier or later frames includes displaying successive earlier or later Web pages.
9. A method as recited in
performing a context-switching operation in which control of a cursor displayed in the application window is switched to one of the user input mechanisms; and thereafter, moving the cursor responsive to user manipulation of the one of the user input mechanisms.
10. A method as recited in
11. A method as recited in
(a) mapping the click buttons to functions relating to the power of magnification of the magnifier window; and (b) changing the magnification of the magnifier window responsive to clicks of the respective click buttons.
12. A method as recited in
14. A system as recited in
means for extracting the units of information from the user input command signal; and means for generating the first and second signals from the information units based on the associated ID tags.
15. A system as recited in
means for processing the first signal as a mouse movement signal; and means for processing the second signal separately.
16. A system as recited in
(a) for the means for operating the first and second displayed symbols based, respectively, on the first and second signals, the displayed symbol includes one of: a cursor, an application window or full-screen application (hereinafter an "application window"), and a special function sub-window; and (b) each of the means for operating includes means for performing one of the steps of: scrolling, in the up/down and left/right directions, the content of an image partially displayed by an application window, moving forward/backward through a sequence of frames displayed by an application window, moving a cursor over the GUI display in the up/down and left/right directions, and moving a special function sub-window over the GUI display in the up/down and left/right directions.
17. A system as recited in
means for positioning a cursor within the application window containing the partly displayed image; and means for scrolling the content simultaneously in the up/down and left/right directions as per the signal.
18. A system as recited in
19. A system as recited in
means for mapping click buttons of the user input mechanism to the forward and backward directions of movement through the sequence of frames; and means for displaying successive earlier or later frames of the sequence of frames responsive to clicks of the respective click buttons.
20. A system as recited in
the application window includes a Web browser; and the means for displaying successive earlier or later frames includes means for displaying successive earlier or later Web pages.
21. A system as recited in
means for performing a context-switching operation in which control of a cursor displayed in the application window is switched to one of the user input mechanisms; and means, operable thereafter, for moving the cursor responsive to user manipulation of the one of the user input mechanisms.
22. A system as recited in
23. A system as recited in
(a) means for mapping the click buttons to functions relating to the power of magnification of the magnifier window; and (b) means for changing the magnification of the magnifier window responsive to clicks of the respective click buttons.
24. A system as recited in
26. A computer program product as recited in
means, provided on the medium, for directing the computer system to extract the units of information from the user input command signal; and means, provided on the medium, for directing the computer system to generate the first and second signals from the information units based on the associated ID tags.
27. A computer program product as recited in
means, provided on the medium, for directing the computer system to process the first signal as a mouse movement signal; and means, provided on the medium, for directing the computer system to process the second signal separately.
28. A computer program product as recited in
(a) for the means for directing to operate the first and second displayed symbols based, respectively, on the first and second signals, the displayed symbol includes one of: a cursor, an application window or full-screen application (hereinafter an "application window"), and a special function sub-window; and (b) each of the means for directing to operate includes means, provided on the medium, for directing the computer system to perform one of the steps of: scrolling, in the up/down and left/right directions, the content of an image partially displayed by an application window, moving forward/backward through a sequence of frames displayed by an application window, moving a cursor over the GUI display in the up/down and left/right directions, and moving a special function sub-window over the GUI display in the up/down and left/right directions.
29. A computer program product as recited in
means, provided on the medium, for directing the computer system to position a cursor within the application window containing the partly displayed image; and means, provided on the medium, for directing the computer system to scroll the content simultaneously in the up/down and left/right directions as per the signal.
30. A computer program product as recited in
31. A computer program product as recited in
means, provided on the medium, for directing the computer system to map click buttons of the user input mechanism to the forward and backward directions of movement through the sequence of frames; and means, provided on the medium, for directing the computer system to display successive earlier or later frames of the sequence of frames responsive to clicks of the respective click buttons.
32. A computer program product as recited in
the application window includes a Web browser; and the means, provided on the medium, for directing the computer system to display successive earlier or later frames includes means, provided on the medium, for directing the computer system to display successive earlier or later Web pages.
33. A computer program product as recited in
means, provided on the medium, for directing the computer system to perform a context-switching operation in which control of a cursor displayed in the application window is switched to one of the user input mechanisms; and means, provided on the medium, operable thereafter, for directing the computer system to move the cursor responsive to user manipulation of the one of the user input mechanisms.
34. A computer program product as recited in
35. A computer program product as recited in
(a) means, provided on the medium, for directing the computer system to map the click buttons to functions relating to the power of magnification of the magnifier window; and (b) means, provided on the medium, for directing the computer system to change the magnification of the magnifier window responsive to clicks of the respective click buttons.
36. A computer program product as recited in
The invention generally relates to the field of computer graphical user interfaces (GUIs). More specifically, the invention relates to user interface systems and methods for supporting activities such as symbol movement and selection on a GUI system.
Since the advent of the graphical user interface (GUI) in the early 1980s, computers have employed, as user input devices, devices that allow a user to perform two basic functions: first, a two-dimensional movement function, such as moving a cursor around on a two-dimensional display, and, second, a pulsing or "clicking" function that allows a user to select a function associated with a particular position on the display.
In a GUI, information displayed includes object symbols, such as windows, icons, slider bars, soft "buttons", etc. The two-dimensional movement function allows a user to move a cursor to an area of the screen within which a desired object symbol is located. The clicking function allows the user to select, operate, or manipulate the object symbols, and, in so doing, perform computer operations.
The functions that a GUI supports are many and varied. For instance, any window that shows a portion of an image (such as a word processor showing a portion of a lengthy document) provides for scrolling through the image. A slider bar is provided for this purpose. For incremental scrolling, the user moves the cursor to an up or down arrow, and holds a click button down to invoke the scrolling function. For long-distance moves, the user positions the cursor on a slider block, holds a click button down, and moves the mouse to drag the slider block along the slider bar. A portion of the image, in a position proportional to the position of the block along the slider bar, is displayed. This arrangement is both easy and intuitive for the user.
Other functions include the "drag and drop" function, similar to that used with the slider block, but applicable to icons and other objects in the GUI display.
As application software has increased in sophistication and "user-friendliness," more and more of the functionality of a computer has migrated from a typewriter-style keyboard to a user interface device providing this functionality.
The most commonly employed user input mechanism is a mouse. A mouse is a hand-held device having a surface contact member such as a ball. The user moves the mouse over a work surface such as a table top, causing the ball to roll. Sensors within the device detect the rolling, and translate it into two-dimensional movement signals analogous to the user's movement of the mouse. The signals are sent over a wire to a computer, in accordance with a known mouse interface protocol. The computer runs a mouse driver application, which interprets the movement signals and directs the movement, on the GUI, of a symbol such as a cursor. The mouse also has click buttons, preferably two, which are positioned so that the user can conveniently press the buttons with his/her fingers, without having to change the grip on the mouse.
Computer software, in the form of "mouse driver" programs, has been employed along with this physical apparatus. A mouse driver essentially receives signals through a mouse interface (typically a serial cable), interprets the signals in terms of movement (two dimensions, in the plus and minus directions each) and selection (mouse button clicking), and directs the operating system and/or application programs to perform a desired function.
A major advance was made when IBM Corporation developed the TrackPoint II™ and TrackPoint III™ pointing devices (hereinafter generally referred to as "TrackPoint devices"). A TrackPoint device includes a small, joystick-like member which is mounted in a keyboard, between the keys. Click buttons are also provided on the keyboard, preferably centered and in front of the space bar.
The TrackPoint device enhanced the portability of small, laptop computers, because all the functionality of a mouse fit within the keyboard. It was not necessary to carry the mouse separately, or to find a flat surface for using the mouse.
However, because graphical user interfaces are so powerful, more sophisticated ways of exploiting their capabilities have been pursued. For instance, since GUI applications provide scrolling functions as well as symbol selection functions, and since tasks both inside an application and outside on the desktop/operating system employ selection functions, it is likely to be a convenience for the user to have multiple cursor manipulation mechanisms.
To further expand the capabilities of a user interface device for use with a GUI computer, apparatus has been added to mice to provide, in effect, a Z axis of movement to go along with the X and Y axes of movement provided by ordinary mice. For instance, in U.S. Pat. No. 5,530,455, Gillick et al., "Roller Mouse for Implementing Scrolling in Windows Applications," and U.S. Pat. No. 5,446,481, Gillick et al., "Multidimensional Hybrid Mouse for Computers," a Z axis roller is added.
However, these additional features are limited in their use, since they are useful only for tasks for which one additional dimension is needed.
IBM Corp. has developed a keyboard with two TrackPoint devices, positioned at two different sites within the keyboard. Because there are two such devices, each having full two-dimensional capability, added functionality and flexibility are realized.
Accordingly, the user, when choosing a type of interface device to use, has many options. In fact, a user has the manual dexterity to make effective use of a plurality of such devices. However, heretofore, GUI software has been limited in its ability to support user commands. For instance, in a word processor, a user must use the same cursor, and the same interface mechanism, for scrolling through a document, selecting text for cutting and pasting, etc. Also, if the user is both operating an application and moving or invoking objects on the desktop, the same cursor and interface device are again used. Therefore, conventional GUI software and interface device drivers have limited the user's efficiency and productivity by not allowing the user to take full advantage of the graphical user interface's capacity to perform quick and convenient functions responsive to user commands.
It is therefore an object of the invention to provide a method and system for providing a user interface with a graphical user interface (GUI) computer system.
The method of the invention comprises the following steps:
First, a user input command signal is received. In accordance with the invention, the signal is compatible in format with conventional user interface signals, such as mouse signals. However, the user interface signal includes first and second signals, which are representative of movements of respective first and second user input mechanisms for two-dimensional movements.
The user input signal is then resolved, by a demultiplexing process, into first and second signals. The first and second signals are representative of two-dimensional movements of first and second, distinct, user interface devices. A preferred embodiment of the invention employs a user interface device which is the subject of co-pending, co-assigned U.S. patent application Ser. No. 08/706,019, filed Aug. 30, 1996, Barber et al., "Hand Held Computer Interface Device Having Multiple Two-Dimensional User Command Inputs," directed to a mouse having a joystick-type pointing device disposed thereon. Thus, the first signal is produced by mouse movement, in a familiar manner. The second signal is produced by the pointing device. The two signals are multiplexed on board the mouse, to produce the above-mentioned user input signal.
After the user input signal has been resolved into the first and second signals, the method proceeds by operating a first displayed symbol based on the first signal, and operating a second displayed symbol based on the second signal.
These steps of operating include any of the following:
(1) scrolling, in the up/down and left/right directions, the content of an image partially displayed by an application window,
(2) moving forward/backward through a sequence of frames displayed by an application window,
(3) moving a cursor over the GUI display in the up/down and left/right directions, and
(4) moving a special function sub-window over the GUI display in the up/down and left/right directions.
Accordingly, a system and method according to the invention allow the user to take better advantage of multiple, two-dimensional user input command mechanisms to further enhance the ease and efficiency offered by the GUI environment.
The invention is advantageously used in connection with a user input device, such as a mouse having a TrackPoint device as well as a surface contact ball for movement over a work surface, as described in co-pending, co-assigned U.S. patent application Ser. No. 08/706,019.
While the invention is primarily disclosed as a method, it will be understood by a person of ordinary skill in the art that an apparatus, such as a conventional data processor, including a CPU, memory, I/O, program storage, a connecting bus, and other appropriate components, could be programmed or otherwise designed to facilitate the practice of the method of the invention. Such a processor would include appropriate program means for executing the method of the invention.
Also, an article of manufacture, such as a pre-recorded disk or other similar computer program product, for use with a data processing system, could include a storage medium and program means recorded thereon for directing the data processing system to facilitate the practice of the method of the invention. It will be understood that such apparatus and articles of manufacture also fall within the spirit and scope of the invention.
FIG. 1 is a system block diagram of the invention and its operating environment.
FIG. 2 is a block diagram showing the functionality of a user input device for use with the invention.
FIG. 3 is a view of a GUI display showing functionality used in accordance with the invention.
FIG. 4 is a view of a GUI display showing functionality used in accordance with the invention.
FIG. 1 is a block diagram of a computer system employing the method of the invention. A user command input device 2 produces movement and selection signals, which are provided through an interface such as a standard mouse cable 4 to a computer 6.
In accordance with the invention, the device 2 includes a plurality of two-dimensional movement inputs, and produces a multiplexed signal which is compatible with the standard mouse interface, but which conveys user command information from the plurality of inputs.
A more detailed illustration of the operation of the device 2 is given in FIG. 2. First and second mechanisms 8 and 10 (such as a mouse surface contact ball and a TrackPoint device) produce signals which are provided to a multiplexer 12. In accordance with standard practice for providing mouse signals to a computer, the signals produced by the mechanisms 8 and 10 include a sequence of segmented, or segmentable, messages.
Note, by the way, that, for the two-TrackPoint-device keyboard mentioned in the Background section, the interface with the computer provides two separate links, so that the GUI support software within the computer separately handles the movement signals from the two TrackPoint devices. The interface here described does not require the extra interface channel.
The multiplexer 12 receives the message segments and tags them with an ID specifying which mechanism they came from. The resultant multiplexed signal is thus in a general format 14, in which a single signal stream includes packets in a sequence, each packet including a tag identifying one of the mechanisms 8 and 10 as a source, and including a message representative of a movement signal and/or click button signal produced by that mechanism. That resultant signal is provided to an interface 16, such as the standard mouse cable 4 of FIG. 1. Note that the interface 16 is a single interface, so that an apparatus in accordance with the invention does not require multiple interfaces, as did the above-discussed prior art IBM keyboard having two TrackPoint devices. Note, further, that the single interface 16 is backward compatible with conventional mouse interfaces.
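By way of illustration only, the following sketch shows one way the tagged-packet format 14 might be organized. The byte layout, field widths, and device ID values are assumptions made for the example; they are not the actual encoding used by the device 2 or by any particular mouse protocol.

```python
# Illustrative sketch (not the actual protocol): each message from a user
# input mechanism is packed with a one-byte source ID tag, so that a single
# signal stream can carry movement and click information from both mechanisms.
import struct

PACKET = struct.Struct("BbbB")   # assumed layout: tag, dx, dy, button bitmask

DEVICE_BALL = 0x01    # assumed ID for the surface contact ball (mechanism 8)
DEVICE_STICK = 0x02   # assumed ID for the joystick-type pointing device (mechanism 10)

def multiplex(messages):
    """Interleave (source_id, dx, dy, buttons) messages into one tagged stream."""
    return b"".join(PACKET.pack(src, dx, dy, buttons)
                    for src, dx, dy, buttons in messages)

# Example: a ball movement followed by a pointing-device movement with a click.
stream = multiplex([(DEVICE_BALL, 3, -1, 0), (DEVICE_STICK, 0, 2, 0b01)])
```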
Returning to FIG. 1, the multiplexed signal 14 is received over the cable 4 at an input mouse interface 18. The signal is provided to a mouse driver 20. The driver 20 includes a demultiplexer, which demultiplexes the signal to produce separate signals corresponding with the separate user input mechanisms 8 and 10. In the course of this demultiplexing, the ID tags are stripped out, and the various segmented messages are grouped separately, and provided as separate outputs.
A bus 22, shown in FIG. 1, schematically represents a plurality of separate lines, functional routes, etc., for carrying the respective signals produced by the demultiplexer. Suitable implementations for separately providing the demultiplexed signals may be used, in accordance with the particular system requirements.
The bus 22 routes the signals separately to the appropriate destinations. The destinations are selectable by the user, in a manner described below. For illustrative purposes, the various lines of the bus 22 are coupled to various software modules, including a device driver 40, an operating system 42, and applications 44 and 46. As per the operation of those software modules, and in accordance with the user's manipulation of the various user input mechanisms to produce the respective signals, the software modules 40, 42, 44, and 46 cause information to be displayed on a display 48 as per the detailed descriptions which follow.
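Continuing the same assumed packet layout, a minimal sketch of the driver-side demultiplexing and routing follows. The handler functions and the routing table are hypothetical placeholders for the user-selectable destinations described above, not the actual driver code.

```python
# Illustrative sketch of the demultiplexer in the driver 20 and the routing
# represented by the bus 22: the ID tags are stripped, messages are grouped per
# mechanism, and each group is delivered to the destination the user selected.
import struct
from collections import defaultdict

PACKET = struct.Struct("BbbB")   # same assumed layout: tag, dx, dy, buttons

def demultiplex(stream: bytes):
    """Split the single multiplexed stream into per-mechanism message lists."""
    per_device = defaultdict(list)
    for offset in range(0, len(stream), PACKET.size):
        tag, dx, dy, buttons = PACKET.unpack_from(stream, offset)
        per_device[tag].append((dx, dy, buttons))
    return per_device

# Hypothetical destinations; in practice these would be the operating system,
# a device driver, or an application, as selected through the window of FIG. 3.
def move_cursor(dx, dy, buttons):
    print("cursor", dx, dy, buttons)

def autoscroll(dx, dy, buttons):
    print("scroll", dx, dy, buttons)

ROUTES = {0x01: move_cursor, 0x02: autoscroll}   # device ID -> selected function

def dispatch(stream: bytes) -> None:
    for tag, messages in demultiplex(stream).items():
        handler = ROUTES.get(tag)
        if handler is None:
            continue                 # unknown mechanism: ignore
        for dx, dy, buttons in messages:
            handler(dx, dy, buttons)
```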
Referring to FIG. 3, part of the functionality of the mouse driver 20 of the invention is to provide a user interface window, as shown in FIG. 3.
It is assumed that a desired number of user interface mechanisms are suitably coupled to the computer 6. For instance, a mouse having a TrackPoint device may be connected to the mouse port of the computer 6, thereby providing two user interface mechanisms: first, the mouse itself, with motion determined by the mouse's surface contact ball, and second, the TrackPoint device.
In accordance with the invention, the driver 20 maps the mechanisms to internally maintained IDs. For each ID, the driver provides the user with configuration software, preferably in the form of a window. When the driver is installed, an icon appears, such as on the desktop. When the user selects the icon, a window such as that of FIG. 3 appears.
The window, at any given time, applies to one of the user interface mechanisms, and so identifies itself to the user. For instance, the mouse surface contact ball might be mapped as device 1, and the TrackPoint device mapped as device 2. As shown, the header at the top of the window identifies which device the window currently applies to.
The window also includes a function for switching to another device, shown as a software switch 26. Where only two user input mechanisms are used, the switch 26 can simply toggle between them. Where a greater number of devices are available to the user, the switch 26 may be replaced by a button which, when pressed, causes a menu of installed devices to be displayed. Then, the user simply selects the user input mechanism he/she wants to work with.
The window includes a bank of functions 28, and soft switches for allowing the user to select which one of the desired functions the selected user input mechanism is to apply to. (Suitable safeguards may be used for warning or disallowing when one user input mechanism is to be set to a function already allocated to another user input mechanism.)
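A minimal sketch of such a per-device function assignment, including the safeguard against allocating one function to two mechanisms, is given below. The class and its API are illustrative assumptions; only the function labels come from the description of FIG. 3.

```python
# Illustrative sketch of the function bank 28 and its safeguard: assigning a
# function that is already allocated to another mechanism is disallowed.
FUNCTIONS = ("AutoScroll", "Web Scroll", "Two Cursor", "Magnifier")

class DeviceConfig:
    def __init__(self):
        self.assignments = {}                      # device ID -> function label

    def assign(self, device_id: int, function: str) -> None:
        if function not in FUNCTIONS:
            raise ValueError(f"unknown function: {function}")
        owner = next((d for d, f in self.assignments.items()
                      if f == function and d != device_id), None)
        if owner is not None:
            # Safeguard: warn or disallow rather than double-allocating.
            raise RuntimeError(f"{function} is already allocated to device {owner}")
        self.assignments[device_id] = function

cfg = DeviceConfig()
cfg.assign(1, "AutoScroll")    # e.g. device 1, the mouse surface contact ball
cfg.assign(2, "Magnifier")     # e.g. device 2, the TrackPoint-style device
```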
Additional controls, shown generally as 30, 32, and 34 may be provided as appropriate for providing user control over the magnitude of operation of the particular task that the user input mechanism is assigned to. This will be explained in more detail in connection with the descriptions of the functions which follow.
Depending on the application being run, a wide variety of different operations may advantageously employ concurrent cursor activities such as those made possible by an apparatus according to the invention. The functions are selectable from the bank 28.
One function supported in accordance with the invention is that of two-dimensional scrolling ("AutoScroll"). Where a window displays a portion of a larger image, such as a lengthy document in a word processor, a user input mechanism may be used for two-dimensional scrolling. A noteworthy advantage of using a two-dimensional user input mechanism for scrolling is that the image can be scrolled in both the X and Y directions. Conventionally, scrolling has been done by pressing a click button when the cursor is positioned on a slider bar or equivalent graphical tool, so scrolling could only be done in one direction at a time.
Also, the speed of scrolling may be related to the magnitude of manipulation of the user input mechanism. For instance, where a TrackPoint device is so used, the scrolling speed is related to the magnitude of the force exerted by the user's fingertip. A suitable force-to-speed transfer function is preferably used, such as that described in co-assigned U.S. patent application Ser. No. 08/316,983, Barrett et al., "Graphical User Interface Cursor Positioning Device Having a Negative Inertia Transfer Function," now issued as U.S. Pat. No. 5,570,111.
The controls 30, 32, and 34, or controls having other suitable formats, may be used for allowing the user to select a desired transfer function, scale the transfer function to modify the sensitivity and responsiveness, etc.
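As a rough illustration of how a force-to-speed transfer function with user-adjustable sensitivity might look, a simple quadratic mapping is sketched below. The shape, dead zone, and sensitivity parameter are assumptions for the example; the negative-inertia transfer function of the cited application is not reproduced here.

```python
# Illustrative force-to-speed mapping for AutoScroll: scroll speed grows with
# the magnitude of fingertip force, after a small dead zone, and is scaled by a
# user-adjustable sensitivity (cf. controls 30, 32, and 34 of FIG. 3).
def scroll_speed(force: float, sensitivity: float = 40.0, dead_zone: float = 0.05) -> float:
    """Map fingertip force (normalized units) to scroll speed (pixels per tick)."""
    magnitude = abs(force)
    if magnitude < dead_zone:
        return 0.0                                  # ignore unintentional light touches
    speed = sensitivity * (magnitude - dead_zone) ** 2
    return speed if force >= 0 else -speed

print(scroll_speed(0.2))    # light press: slow scrolling
print(scroll_speed(0.9))    # firm press: much faster scrolling
```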
Another function is designated in FIG. 3 as "Web Scroll". This function assumes that an application has displayed a sequence of pages, documents, etc., and that a history of that sequence has been maintained. The user input mechanism is used here to step, forward and backward, through the sequence of displayed pages. Thus, this function has particular applicability to Web browsers.
A preferred implementation is to map the click buttons of the user input mechanism to the "forward" and "backward" functions of the applications. For instance, clicking the left mouse button might go to the previously viewed page, and clicking the right button might go to the next-viewed page.
An alternative implementation would be to use the two-dimensional movement element of the user input mechanism for stepping through the sequence of pages. For instance, a leftward press of a TrackPoint device, having a duration and magnitude which satisfies a predetermined threshold or other criterion, might be recognized as a user command to go to the previously viewed page. A similar rightward press is interpreted as a command to go to the next-viewed page. For details on the recognition of particular user input signatures for alternative functions, see co-assigned U.S. patent application Ser. No. 08/483,594, Marks et al., "Enhanced Program Access in a Graphical User Interface," now issued as U.S. Pat. No. 5,586,243.
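The alternative just described can be sketched as a simple threshold test on the pointing-device deflection. The threshold values and sampling interval below are illustrative assumptions, not figures from the cited application.

```python
# Illustrative recognition of a sustained left/right press as a back/forward
# page command: the deflection must exceed a magnitude threshold and be held
# for a minimum duration before it is treated as a navigation command.
MAGNITUDE_THRESHOLD = 0.5      # assumed minimum horizontal deflection
DURATION_THRESHOLD = 0.3       # assumed hold time in seconds

def interpret_press(samples, dt):
    """samples: horizontal deflections taken every dt seconds.
    Returns 'back', 'forward', or None."""
    held = 0.0
    for x in samples:
        if abs(x) >= MAGNITUDE_THRESHOLD:
            held += dt
            if held >= DURATION_THRESHOLD:
                return "back" if x < 0 else "forward"
        else:
            held = 0.0
    return None

print(interpret_press([0.7] * 10, dt=0.05))          # sustained rightward press -> 'forward'
print(interpret_press([-0.7, 0.0, -0.7], dt=0.05))   # too brief -> None
```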
Yet another function is to allow for two cursors ("Two Cursor"). Within some individual applications, this can be useful. For instance, in a word processor, one cursor can be used for scrolling, while another can be used for selecting text for cutting and pasting. See also co-pending, co-assigned U.S. patent application Ser. No. 08/631,110, Barber et al., "Multiple Display Pointers for Computer Graphical User Interfaces."
Alternatively, one cursor can be used within an application, while another is used by the operating system, such as on a desktop.
Finally, a user input mechanism can be mapped, preferably by the operating system but alternatively within an application, to a special function sub-window, such as a magnifier (FIG. 4). In the illustrated example, the special function sub-window is a magnifier 36, which is positioned on a word processor application window 38. As shown, the magnifier 36 magnifies the GUI content (such as text) which appears on the display in the position where the magnifier is located.
In accordance with the invention, a user input mechanism is mapped to the magnifier 36. As the user input mechanism is manipulated to produce two-dimensional movement, the magnifier 36 moves, in the same manner that a cursor would move. The dimensions of the sub-window, the magnification power, and other suitable parameters may be adjusted by the user through use of controls such as the controls 30, 32, and 34 of FIG. 3. Alternatively, a series of predetermined values for the sub-window size, magnification power, or the like can be established, and the click buttons assigned to that selectable function. Then, the user can step the function, such as the magnification power, up or down through the predetermined set of values by clicking the left or right buttons, respectively.
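A minimal sketch of stepping the magnification through a predetermined set of values, with the magnifier window tracking the mapped mechanism's movement, follows. The preset values and the class itself are illustrative assumptions; only the left-increase/right-decrease button mapping comes from the description above.

```python
# Illustrative magnifier sub-window: it moves like a cursor under the mapped
# mechanism, and clicks step its power through a predetermined set of values
# (left button increases, right button decreases, as described above).
MAGNIFICATIONS = (1.5, 2.0, 3.0, 4.0)    # assumed preset powers

class Magnifier:
    def __init__(self):
        self.index = 0                   # start at the lowest preset power
        self.x, self.y = 0, 0            # position of the sub-window on the display

    def move(self, dx: int, dy: int) -> None:
        """Move the magnifier over the GUI display, in the same manner as a cursor."""
        self.x += dx
        self.y += dy

    def on_click(self, button: str) -> float:
        """Step the magnification up (left click) or down (right click)."""
        if button == "left":
            self.index = min(self.index + 1, len(MAGNIFICATIONS) - 1)
        elif button == "right":
            self.index = max(self.index - 1, 0)
        return MAGNIFICATIONS[self.index]

m = Magnifier()
m.move(25, -10)            # drag the magnifier toward the text of interest
print(m.on_click("left"))  # -> 2.0, one step up from the lowest power
```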
It is believed that the special function sub-window, where used as a magnifier, may best be supported by the operating system, so that it can be moved all over the display, across window boundaries. However, other special functions may be particularly suited for use within a particular application, and would then be limited within the application window.
Note, in general, that applications, etc., have been described in terms of "windows." It should be understood that this terminology is not intended to limit the scope of GUI applications which can advantageously employ the invention. For instance, if an application is displayed in "full screen" mode, rather than as a window occupying only a portion of the display, all of the principles pertaining to the above discussion apply equally.
Using the foregoing specification, the invention may be implemented using standard programming and/or engineering techniques, in software, firmware, hardware, or any combination or subcombination thereof. Any such resulting program(s), having computer readable program code means, may be embodied or provided within one or more computer readable or usable media such as fixed (hard) drives, disks, diskettes, optical disks, magnetic tape, semiconductor memories such as read-only memory (ROM), etc., or any transmitting/receiving medium such as the Internet or other communication network or link, thereby making a computer program product, i.e., an article of manufacture, according to the invention. The article of manufacture containing the computer programming code may be made and/or used by executing the code directly from one medium, by copying the code from one medium to another medium, or by transmitting the code over a network.
An apparatus for making, using, or selling the invention may be one or more processing systems including, but not limited to, a central processing unit (CPU), memory, storage devices, communication links, communication devices, servers, I/O devices, or any subcomponents or individual parts of one or more processing systems, including software, firmware, hardware or any combination or subcombination thereof, which embody the invention as set forth in the claims.
User input may be received from the keyboard, mouse, pen, voice, touch screen, or any other means by which a human can input data to a computer, including through other programs such as application programs.
One skilled in the art of computer science will easily be able to combine the software created as described with appropriate general purpose or special purpose computer hardware to create a computer system and/or computer subcomponents embodying the invention and to create a computer system and/or computer subcomponents for carrying out the method of the invention. While the preferred embodiment of the present invention has been illustrated in detail, it should be apparent that modifications and adaptations to that embodiment may occur to one skilled in the art without departing from the spirit or scope of the present invention as set forth in the following claims.
Patent | Priority | Assignee | Title
5021771 | Aug 09 1988 | | Computer input device with two cursor positioning spheres
5172102 | Mar 16 1990 | Hitachi, Ltd. | Graphic display method
5228124 | Mar 22 1989 | Mutoh Industries, Ltd. | Coordinate reader
5263134 | Oct 25 1989 | Apple Inc. | Method and apparatus for controlling computer displays by using a two dimensional scroll palette
5313229 | Feb 05 1993 | Gilligan, Federico Gustavo | Mouse and method for concurrent cursor position and scrolling control
5374942 | Feb 05 1993 | Gilligan, Federico Gustavo | Mouse and method for concurrent cursor position and scrolling control
5446481 | Oct 11 1991 | Silicon Valley Bank | Multidimensional hybrid mouse for computers
5473344 | Jan 06 1994 | Microsoft Technology Licensing, LLC | 3-D cursor positioning device
5512892 | Feb 25 1994 | IBM Corporation | Hand held control device
5530455 | Aug 10 1994 | Kye Systems America Corporation | Roller mouse for implementing scrolling in windows applications
5586243 | Apr 15 1994 | International Business Machines Corporation | Multiple display pointers for computer graphical user interfaces
5666499 | Aug 04 1995 | Autodesk Canada Co. | Clickaround tool-based graphical interface with two cursors
5691748 | Apr 02 1994 | Wacom Co., Ltd. | Computer system having multi-device input system
5694150 | Sep 21 1995 | Elo Touch Solutions, Inc. | Multiuser/multi pointing device graphical user interface system
5731801 | Mar 31 1994 | Wacom Co., Ltd. | Two-handed method of displaying information on a computer display
EP16395 | | |