A user interface of a smart compact device comprises a display screen combined with an activating object, the combination being capable of reporting X and Y position information in a first state and a second state. Selected data is displayed or hidden depending on whether the first state or the second state is detected.
1. A user interface of a smart compact device comprising:
a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state;
means configured to display data on said display screen responsive to said activating object being disposed in said proximate non-touching state for a selected time period; and
means configured to hide at least a portion of said data responsive to said activating object being disposed in said touching state.
28. A method for presenting and manipulating information in a user interface of a smart compact device, comprising a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state, the method comprising the steps of:
determining if said activating object is in said proximate non-touching state;
displaying data responsive to said activating object being disposed in said proximate non-touching state for a selected time period; and
controlling said data on said display screen to hide second data responsive to said activating object being disposed in said proximate non-touching state.
59. A method for presenting and manipulating information in a user interface of a smart compact device comprising a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state, the method comprising the steps of:
determining if said activating object is in said proximate non-touching state;
displaying data responsive to said activating object being disposed in said proximate non-touching state for a selected time period;
determining if said activating object is in said touching state; and
hiding at least a portion of said data if said activating object is in said touching state.
48. A method for controlling a display of data on a user interface presented on a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a first proximate non-touching state and a second proximate non-touching state, defined by a first and a second proximity relationship between said activating object and said display screen, respectively, the method comprising the steps of:
sensing a sensed relationship between said activating object and said display screen;
determining if said sensed relationship is said first proximity relationship;
displaying a first group of data on said display screen following a selected time period if said sensed relationship is said first proximity relationship;
determining if said sensed relationship is said second proximity relationship;
displaying a second group of data on said display screen if said sensed relationship is said second proximity relationship; and
controlling said display screen to hide at least a portion of said second group of data responsive to said activating object being disposed in said first proximity relationship.
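The two-level proximity method of claim 48 can be sketched as follows. This is an illustrative sketch only: the distance thresholds, data-group names, and function names are assumptions, not values from the specification.

```python
# Hypothetical thresholds defining the first (far) and second (near)
# proximity relationships between the activating object and the screen.
FIRST_PROXIMITY_MAX_MM = 20.0
SECOND_PROXIMITY_MAX_MM = 5.0

def classify_relationship(distance_mm):
    """Map a sensed distance to a proximity relationship, nearest first."""
    if distance_mm <= SECOND_PROXIMITY_MAX_MM:
        return "second"
    if distance_mm <= FIRST_PROXIMITY_MAX_MM:
        return "first"
    return "none"

def update_display(distance_mm, shown):
    """Return the set of data groups to show for the sensed relationship.

    In the first relationship the first group is shown and any portion of
    the second group is hidden; in the second relationship the second
    group is shown as well.
    """
    relationship = classify_relationship(distance_mm)
    shown = set(shown)
    if relationship == "first":
        shown.add("group1")
        shown.discard("group2")  # hide second group when back in first state
    elif relationship == "second":
        shown.add("group2")
    return shown
```

The hiding step at the end of the claim corresponds to the `discard` call: returning to the first proximity relationship removes the second group from the displayed set.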
2. The user interface of
6. The user interface of
8. The user interface of
10. The user interface of
11. The user interface of
12. The user interface of
17. The user interface of
22. The user interface of
23. The user interface of
24. The user interface of
25. The user interface of
26. The user interface of
27. The user interface of
29. The method of
33. The method of
34. The method of
35. The method of
37. The method of
38. The method of
39. The method of
40. The method of
determining if said activating object is in said touching state; and
hiding at least a portion of said data if said activating object is in said touching state.
45. The method of
46. The method of
47. The method of
49. The method of
50. The method of
51. The method of
52. The method of
53. The method of
54. The method of
55. The method of
56. The method of
57. The method of
58. The method of
The invention relates generally to smart compact devices, and specifically to the display and manipulation of information on such devices by the use of a pen or capacitive input system for use with a pen, finger, or the like, comprising a pen sensor 18 and a pen control system 20, generally analogous to that described in publication WO 00/33244, entitled “Position Sensor”. This system has the key advantage that it can provide X and Y position data both when the activating object is touching the display screen and when it is held in close proximity to, but not touching (i.e., hovering over), the display screen, and is able to distinguish between the two states. Alternatively, the first state may require a firm touch while the second “non-touching” state may occur even when the activating object is lightly touching the display screen.
In an alternative embodiment, such an enhanced pointing device could also be a capacitive touchpad, capable of sensing an activating object, such as a finger, pen, and the like, in close proximity to its surface, as well as when touched to the surface.
In a further alternative embodiment, the enhanced pointing device could be any device capable of providing X and Y position data, and two or more states associated with that data, those states being a base state (corresponding to a mouse pointer position with no buttons pressed, or an activating object in close proximity to, but not touching a display screen), a selected state (corresponding to a mouse pointer position with the left button clicked, or an activating object touched to the display screen of a smart compact device), and zero or more further extended states (corresponding, for example, to a mouse with the right button clicked, or an activating object with a side button held in).
In a further alternative embodiment, the enhanced pointing device is an inductive pen input system (or inductive sensing system) that is capable of providing X and Y position data, a base state, a selected state, and zero or more extended states, where the extended states are related to the distance of the activating object from the sensor.
In a further alternative embodiment, the enhanced pointing device is a device capable of providing X and Y position data, a base state, a selected state, and extended states, where the extended states are selected by a user-controlled parameter of the enhanced pointing device.
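The base, selected, and extended states described in the embodiments above can be sketched as an enumeration. The state names and the number of extended states are illustrative assumptions; a given device may expose more or fewer extended states.

```python
from enum import Enum, auto

class PointerState(Enum):
    """States reportable by an enhanced pointing device alongside X/Y data."""
    BASE = auto()       # hovering in close proximity, or mouse with no buttons
    SELECTED = auto()   # touching the screen, or left button clicked
    EXTENDED_1 = auto() # e.g. side button held, right click, or a farther hover band
```

An inductive pen system could map `EXTENDED_1` to a distance band, while a pen with a side button could map it to the button being held in, as the embodiments above describe.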
In an illustrative embodiment, the pen or finger control system 20 is driven by a pen driver 22, which can be software running on the microprocessor of the smart compact device. This pen driver 22 converts the pen input data 34 from the sensor system into pen input object position messages 36, formatted according to the requirements of the operating system 24. These messages 36 contain both the X and Y position data, and flags to signal which one of the states the pen system is reporting.
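The conversion the pen driver 22 performs, from raw sensor data into position messages carrying X/Y coordinates plus a state flag, might look like the following sketch. The message field names and the raw-input tuple format are assumptions for illustration, not the actual driver interface.

```python
from dataclasses import dataclass
from enum import Enum

class PenState(Enum):
    HOVERING = 0  # proximate non-touching state
    TOUCHING = 1  # touching state

@dataclass
class PenPositionMessage:
    """An OS-level message: X and Y position plus a state flag."""
    x: int
    y: int
    state: PenState

def convert_pen_input(raw):
    """Convert raw sensor data (x, y, touching?) into a position message."""
    x, y, touching = raw
    state = PenState.TOUCHING if touching else PenState.HOVERING
    return PenPositionMessage(x=x, y=y, state=state)
```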
In an alternative embodiment, the software running on the smart compact device may be structured in a manner other than that shown in
In an illustrative embodiment, the operating system 24 processes the pen or finger position message, and alters the visible state of any appropriate screen object according to the state of the activating object as reported in the pen position message. If the activating object has been held above a screen object for more than a short length of time (for example, a short length of time being typically 1 second or less), the operating system will change the data displayed on the screen so that the pop-up data is displayed.
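The hover-timing behavior described above, showing pop-up data once the activating object has been held over a screen object for the selected time period, can be sketched as a small state holder. The one-second threshold follows the text; the class and method names are illustrative assumptions.

```python
HOVER_THRESHOLD_S = 1.0  # "a short length of time being typically 1 second or less"

class ScreenObject:
    """Tracks hover duration over one screen object and toggles its pop-up."""

    def __init__(self):
        self.hover_started_at = None
        self.popup_visible = False

    def on_pointer_event(self, hovering_over_me, now_s):
        if hovering_over_me:
            if self.hover_started_at is None:
                self.hover_started_at = now_s  # hover just began
            elif now_s - self.hover_started_at >= HOVER_THRESHOLD_S:
                self.popup_visible = True      # held long enough: show pop-up
        else:
            self.hover_started_at = None       # hover ended: reset and hide
            self.popup_visible = False
```

Setting `HOVER_THRESHOLD_S` to zero yields the alternative embodiment in which the pop-up data is triggered immediately.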
In an alternative embodiment, the pop-up data can be triggered immediately when the activating object is held over the screen object. Such an embodiment can be used, for example, to enable a user to easily dial the correct number from the contacts directory of a smartphone or similarly enabled portable device.
In an alternative embodiment, the selected data is displayed or hidden depending on whether the first state or the second state is detected. Information on the display screen is controlled in response to the presence of an activating object. At least a portion of the information can be selectively hidden and/or displayed in response to whether the activating object is in the first state or the second state.
FIG. 3B is an alternative embodiment generally analogous to FIG. 3A, except that FIG. 3B illustrates the invention in the context of a capacitive touchpad capable of sensing a finger in close proximity to its surface, as well as when touched to the surface. In FIG. 3B the enhanced pointing device is a capacitive finger sensor 18B as opposed to the pen sensor 18 of FIG. 3A. As in FIG. 3A, the control system 20B can provide X and Y position data both when the finger is touching the display screen and when it is held in close proximity to, but not touching, the display screen, and is able to distinguish between the two states. Alternatively, the first state may require a firm touch while the second “non-touching” state may occur even when the finger is lightly touching the display screen.
With continued reference to FIG. 3B, the control system 20B is driven by a driver 22B, which can be software running on the microprocessor of the touchpad. This driver 22B outputs finger position messages 36B, formatted according to the requirements of the operating system 24B. These messages 36B contain both the X and Y position data, and flags to signal which one of the states the system is reporting.
In an alternative embodiment, the software running on the capacitive touchpad may be structured in a manner other than that shown in FIG. 3B, as long as it is one that reports finger position to the appropriate component of the touchpad device, such as the operating system or the user interface.
In the embodiment illustrated in FIG. 3B, the operating system 24B processes the finger position message, and alters the visible state of any appropriate screen object according to the state of the finger as reported in the finger position message 36B. If the finger has been held above a screen object for more than a short length of time (for example, a short length of time being typically 1 second or less), the operating system will change the data displayed on the screen so that the pop-up data is displayed.
In an alternative embodiment, the pop-up data can be triggered immediately when the finger is held over the screen object. Such an embodiment can be used, for example, to enable a user to easily dial the correct number from the contacts directory of a smartphone or similarly enabled portable device.
In an alternative embodiment, the selected data is displayed or hidden depending on whether the first state or the second state is detected. Information on the display screen is controlled in response to the presence of a finger. At least a portion of the information can be selectively hidden and/or displayed in response to whether the finger is in the first state or the second state.
The data displayed on this screen is a summary or abstraction of the data held in the smart compact device about each contact. Typically, each contact will have at least the data illustrated in
In the prior art smart compact devices that use a resistive touch screen as a pointing device, the process of dialing a contact from the directory would typically involve the steps shown in
In an alternative prior-art embodiment, tapping a dial icon in the directory listing (reference numeral 74 in
In the present invention, the process is simplified for the user.
In the case of ambiguous names, the process of
In an alternative embodiment, the removal of the pop-up data may be signaled by another user action, such as the movement of the activating object outside the area of the display screen occupied by the pop-up data.
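The movement-based dismissal rule above reduces to a hit test against the screen region occupied by the pop-up data. This is a minimal sketch with an assumed rectangle representation; the real display geometry is device-specific.

```python
def point_in_rect(x, y, rect):
    """True if (x, y) lies inside rect = (left, top, width, height)."""
    left, top, w, h = rect
    return left <= x < left + w and top <= y < top + h

def popup_should_remain(x, y, popup_rect):
    """Keep the pop-up only while the activating object stays over it."""
    return point_in_rect(x, y, popup_rect)
```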
It will be recognized by those skilled in the art that the present invention is not limited to use with a contacts directory, but can be used for any application that requires an abstracted display of detailed data, and ready access to the detailed data.
In an alternative embodiment, the pop-up data does not have to take up the whole display screen of the smart compact device, but can be displayed on a part of the display.
In a further alternative embodiment, the directory listing takes the form of
It is a further advantage of this embodiment that movement of the activating object to the upper or lower extremes of the directory list (Reference numeral 92 of
It will be realized by those skilled in the art that the present invention applies to all classes of screen objects, whether they are used in the software application user interface, as part of the operating system user interface, or in a smart compact device where the software is not structured in this way but contains a user interface and an enhanced pointing device. It will also be appreciated that the novel use of the combination of an enhanced pointing device and pop-up data greatly enhances the usability of the user interface on a smart compact device, in a way that has not previously been demonstrated.
Alternatively, the behavior of the screen object may be modified by the state information returned by the enhanced pointing device with the X and Y position data, so that more than one control action may be produced from one control object.
Alternatively, the pop-up data for a screen object may be triggered by a further extended state of the enhanced pointing device, distinct from the base and selected states.
In an illustrative embodiment, the exact method of triggering the pop-up data, the duration of any timings associated with triggering the pop-up data, the range from the display screen that is defined as proximity and the form of any pop-up data will be determined by usability testing so as to maximize the benefit to the user in a given application.
While embodiments and applications of this invention have been shown and described, it would be apparent to those skilled in the art that many more modifications than mentioned above are possible without departing from the inventive concepts herein. The invention, therefore, is not to be restricted except in the spirit of the appended claims.
A user interface of a smart compact device is provided which includes: a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state; means configured to display data on the display screen responsive to the activating object being disposed in the proximate non-touching state for a selected time period; and means configured to hide at least a portion of the data responsive to the activating object being disposed in the touching state. The selected time period may be of a duration that ends substantially immediately after the activating object is disposed in the proximate non-touching state, wherein the selected time period may be a short length of time such as approximately one second.
The display screen may include an inductive sensing system, and the user interface may employ a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.
In an embodiment, the smart compact device displays the data in a window.
In another embodiment, once the data has been displayed, touching the activating object to the display screen substantially on the data causes a first action to occur, the first action being different from a second action that would have occurred if the data had not been displayed.
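The two-action dispatch described above can be sketched as follows, using the contacts-directory example from the description. The action names are hypothetical labels, not part of the specification.

```python
def on_touch(popup_visible, touch_on_popup):
    """Select the action a touch triggers, depending on pop-up state.

    Touching the screen substantially on displayed pop-up data triggers a
    first action (e.g. dialing the shown number); the same touch without
    the pop-up triggers the second, default action.
    """
    if popup_visible and touch_on_popup:
        return "dial-number"   # first action
    return "open-contact"      # second (default) action
```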
In other embodiments, the smart compact device may be a handheld device, the activating object is selected from the group consisting of a finger and a pen, and the display screen is adapted to communicate with an activating object disposed in at least one of the touching state, the proximate non-touching state, and a third state, wherein the third state is a second proximate non-touching state different from the proximate non-touching state.
In other embodiments, the data may include textual data, a graphic, a control object, and/or additional data.
In other embodiments, the smart compact device may be a personal digital assistant (PDA), a media player, a digital video device, a digital camera, and/or a mobile phone.
In other embodiments, the smart compact device executes a SYMBIAN operating system, a LINUX operating system, and/or a WINDOWS operating system.
In various embodiments, the hiding means may be configured to completely hide the data, hide the portion of the data by overlaying additional data over it, and/or hide the portion of the data by restoring the portion of the display screen previously obscured by the data.
A method is provided for presenting and manipulating information in a user interface of a smart compact device including a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state. The method includes: determining if the activating object is in the proximate non-touching state; displaying data responsive to the activating object being disposed in the proximate non-touching state for a selected time period; and controlling the data on the display screen to hide second data responsive to the activating object being disposed in the proximate non-touching state.
In an embodiment, the selected time period is a length of time that ends substantially immediately after determining if the activating object is in the proximate non-touching state, wherein the selected time period may be a short length of time such as approximately one second.
In the method, the display screen may include an inductive sensing system, and the user interface may employ a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.
In an embodiment, the smart compact device displays the data in a window.
In various embodiments, once the data has been displayed, touching the activating object to the display screen substantially on the data causes a first action to occur, the first action being different from a second action that would have occurred if the data had not been displayed.
In various embodiments, the smart compact device is a handheld device, the activating object is selected from the group consisting of a finger and a pen, and the display screen is adapted to communicate with an activating object disposed in at least one of the touching state, the proximate non-touching state, and a third state, wherein the third state is a second proximate non-touching state different from the proximate non-touching state.
The method may also involve determining if the activating object is in the touching state; and hiding at least a portion of the data if the activating object is in the touching state.
In various embodiments of the method, the data may include textual data, a graphic, a control object, and/or additional data.
In an embodiment, the controlling step may involve completely hiding the data, overlaying additional data over the portion of the data to thereby hide the data, and/or restoring the portion of the display screen previously obscured by the data to thereby hide the data.
A method is also provided for controlling a display of data on a user interface presented on a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a first proximate non-touching state and a second proximate non-touching state, defined by a first and a second proximity relationship between the activating object and the display screen, respectively. The method includes sensing a sensed relationship between the activating object and the display screen; determining if the sensed relationship is the first proximity relationship; displaying a first group of data on the display screen following a selected time period if the sensed relationship is the first proximity relationship; determining if the sensed relationship is the second proximity relationship; displaying a second group of data on the display screen if the sensed relationship is the second proximity relationship; and controlling the display screen to hide at least a portion of the second group of data responsive to the activating object being disposed in the first proximity relationship.
In an embodiment, the first proximity relationship includes a first function related to a distance between the activating object and the display screen, and the second proximity relationship includes a second function related to the distance between the activating object and the display screen, the second function being different from the first function.
In an embodiment, sensing the sensed relationship occurs for a pre-selected period of time.
In an embodiment, the second group of data may be displayed a second selected time period after determining if the sensed relationship is the second proximity relationship.
In an embodiment, the second proximity relationship includes a user-controlled parameter of the user interface, wherein the user-controlled parameter may be defined by a switch coupled to the activating object, and wherein the switch may be a button.
In various embodiments, the controlling step may involve completely hiding the portion of the second group of data, overlaying additional data over the portion of the second group of data to thereby hide it, and/or restoring at least a portion of the first group of data obscured by the second group of data.
A method is also provided for presenting and manipulating information in a user interface of a smart compact device comprising a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state. The method involves the steps of: determining if the activating object is in the proximate non-touching state; displaying data responsive to the activating object being disposed in the proximate non-touching state for a selected time period; determining if the activating object is in the touching state; and hiding at least a portion of the data if the activating object is in the touching state.
Gillespie, David W., Foote, Geoffrey
Patent | Priority | Assignee | Title |
4817034, | Feb 11 1986 | DAMILIC CORPORATION A MARYLAND CORPORATION | Computerized handwriting duplication system |
5149919, | Oct 31 1990 | International Business Machines Corporation | Stylus sensing system |
5347295, | Oct 31 1990 | THE CHASE MANHATTAN BANK, AS COLLATERAL AGENT | Control of a computer through a position-sensed stylus |
5805164, | Apr 29 1996 | Microsoft Technology Licensing, LLC | Data display and entry using a limited-area display panel |
5861583, | Jun 08 1992 | Synaptics Incorporated | Object position detector |
5923327, | Apr 23 1997 | BlackBerry Limited | Scrolling with automatic compression and expansion |
5995101, | Oct 29 1997 | Adobe Systems Incorporated | Multi-level tool tip |
6054979, | Aug 21 1996 | HEWLETT-PACKARD DEVELOPMENT COMPANY, L P | Current sensing touchpad for computers and the like |
6133906, | Mar 15 1993 | 3M Innovative Properties Company | Display-integrated stylus detection system |
6236396, | May 27 1992 | Apple Inc | Method and apparatus for controlling a scheduler |
6424338, | Sep 30 1999 | Gateway, Inc. | Speed zone touchpad |
6429846, | Jun 23 1998 | Immersion Corporation | Haptic feedback for touchpads and other touch controls |
6483526, | Sep 24 1998 | International Business Machines Corporation | Multi-layer entry fields |
6486874, | Nov 06 2000 | Google Technology Holdings LLC | Method of pre-caching user interaction elements using input device position |
6492979, | Sep 07 1999 | ELO TOUCH SOLUTIONS, INC | Dual sensor touchscreen utilizing projective-capacitive and force touch sensors |
6563514, | Apr 13 2000 | EXTENSIO SOFTWARE, INC | System and method for providing contextual and dynamic information retrieval |
6587587, | May 20 1993 | Microsoft Corporation | System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings |
6614422, | Nov 04 1999 | Microsoft Technology Licensing, LLC | Method and apparatus for entering data using a virtual input device |
6674425, | Dec 10 1996 | Willow Design, Inc. | Integrated pointing and drawing graphics system for computers |
6930672, | Oct 19 1998 | Fujitsu Limited | Input processing method and input control apparatus |
20030122774, | |||
20030137522, | |||
20060004874, | |||
EP802476, | |||
EP996052, | |||
WO28407, | |||
WO33244, | |||
WO9954807, |
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Jul 25 2001 | GILLESPIE, DAVID W | Synaptics Incorporated | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 037352 | /0335 | |
Jul 27 2001 | FOOTE, GEOFFREY | Synaptics Incorporated | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 037352 | /0335 | |
Oct 24 2013 | Wacom Co., Ltd. | (assignment on the face of the patent) | / | |||
Sep 27 2017 | SYNAPTICS INCORPROATED | Wells Fargo Bank, National Association | SECURITY INTEREST SEE DOCUMENT FOR DETAILS | 051316 | /0777 | |
Sep 27 2017 | Synaptics Incorporated | Wells Fargo Bank, National Association | CORRECTIVE ASSIGNMENT TO CORRECT THE CORRECT THE SPELLING OF THE ASSIGNOR NAME PREVIOUSLY RECORDED AT REEL: 051316 FRAME: 0777 ASSIGNOR S HEREBY CONFIRMS THE ASSIGNMENT | 052186 | /0756 | |
Aug 05 2019 | Wells Fargo Bank, National Association | Synaptics Incorporated | RELEASE BY SECURED PARTY SEE DOCUMENT FOR DETAILS | 050206 | /0252 | |
Aug 27 2019 | Synaptics Incorporated | WACOM CO , LTD | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 050229 | /0145 |
Date | Maintenance Schedule |
Oct 29 2022 | 4 years fee payment window open |
Apr 29 2023 | 6 months grace period start (w surcharge) |
Oct 29 2023 | patent expiry (for year 4) |
Oct 29 2025 | 2 years to revive unintentionally abandoned end. (for year 4) |
Oct 29 2026 | 8 years fee payment window open |
Apr 29 2027 | 6 months grace period start (w surcharge) |
Oct 29 2027 | patent expiry (for year 8) |
Oct 29 2029 | 2 years to revive unintentionally abandoned end. (for year 8) |
Oct 29 2030 | 12 years fee payment window open |
Apr 29 2031 | 6 months grace period start (w surcharge) |
Oct 29 2031 | patent expiry (for year 12) |
Oct 29 2033 | 2 years to revive unintentionally abandoned end. (for year 12) |