A user interface of a smart compact device employs a display screen in combination with an activating object that is capable of reporting X and Y position information in a first state and a second state. Selected data is displayed or hidden depending on whether the first state or the second state is detected.

Patent: RE47676
Priority: Jun 07 2001
Filed: Oct 24 2013
Issued: Oct 29 2019
Expiry: Jun 07 2021
Status: Expired
1. A user interface of a smart compact device comprising:
a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state;
means configured to display data on said display screen responsive to said activating object being disposed in said proximate non-touching state for a selected time period; and
means configured to hide at least a portion of said data responsive to said activating object being disposed in said touching state.
28. A method for presenting and manipulating information in a user interface of a smart compact device, comprising a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state, the method comprising the steps of:
determining if said activating object is in said proximate non-touching state;
displaying data responsive to said activating object being disposed in said proximate non-touching state for a selected time period; and
controlling said data on said display screen to hide second data responsive to said activating object being disposed in said proximate non-touching state.
59. A method for presenting and manipulating information in a user interface of a smart compact device comprising a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state, the method comprising the steps of:
determining if said activating object is in said proximate non-touching state;
displaying data responsive to said activating object being disposed in said proximate non-touching state for a selected time period;
determining if said activating object is in said touching state; and
hiding at least a portion of said data if said activating object is in said touching state.
48. A method for controlling a display of data on a user interface presented on a display screen configured to report position information, said display screen adapted to communicate with an activating object disposed in at least one of a first proximate non-touching state and a second proximate non-touching state, defined by a first and a second proximity relationship between said activating object and said display screen, respectively, the method comprising the steps of:
sensing a sensed relationship between said activating object and said display screen;
determining if said sensed relationship is said first proximity relationship;
displaying a first group of data on said display screen following a selected time period if said sensed relationship is said first proximity relationship;
determining if said sensed relationship is said second proximity relationship;
displaying a second group of data on said display screen if said sensed relationship is said second proximity relationship; and
controlling said display screen to hide at least a portion of said second group of data responsive to said activating object being disposed in said first proximity relationship.
2. The user interface of claim 1, wherein said selected time period is of a duration that ends substantially immediately after said activating object is disposed in said proximate non-touching state.
3. The user interface of claim 1, wherein said selected time period is a short length of time.
4. The user interface of claim 3, wherein said short length of time is approximately one second.
5. The user interface of claim 1, wherein said display screen includes an inductive sensing system.
6. The user interface of claim 1, wherein said user interface employs a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.
7. The user interface of claim 1, wherein said smart compact device displays said data in a window.
8. The user interface of claim 1, wherein once said data has been displayed, touching said activating object to said display screen substantially on said data causes a first action to occur, said first action being different from a second action that would have occurred if said data had not been displayed.
9. The user interface of claim 1, wherein said smart compact device is a handheld device.
10. The user interface of claim 1, wherein said activating object is selected from the group consisting of a finger and a pen.
11. The user interface of claim 1, wherein said display screen is adapted to communicate with an activating object disposed in at least one of said touching state, said proximate non-touching state, and a third state.
12. The user interface of claim 11, wherein said third state is a second proximate non-touching state different from said proximate non-touching state.
13. The user interface of claim 1, wherein said data includes textual data.
14. The user interface of claim 1, wherein said data includes a graphic.
15. The user interface of claim 1, wherein said data includes a control object.
16. The user interface of claim 1, wherein said data includes additional data.
17. The user interface of claim 1 wherein said smart compact device is a personal digital assistant (PDA).
18. The user interface of claim 1 wherein said smart compact device is a mobile phone.
19. The user interface of claim 1 wherein said smart compact device is a media player.
20. The user interface of claim 1 wherein said smart compact device is a digital camera.
21. The user interface of claim 1 wherein said smart compact device is a digital video device.
22. The user interface of claim 1 wherein said smart compact device executes a SYMBIAN operating system.
23. The user interface of claim 1 wherein said smart compact device executes a LINUX operating system.
24. The user interface of claim 1 wherein said smart compact device executes a WINDOWS operating system.
25. The user interface of claim 1 wherein said hiding means is configured to completely hide said data.
26. The user interface of claim 1 wherein said hiding means is configured to hide said portion of said data by overlaying additional data over said portion of said data.
27. The user interface of claim 1 wherein said hiding means is configured to hide said portion of said data by restoring the portion of the display screen previously obscured by said data.
29. The method of claim 28, wherein said selected time period is of a length of time that ends substantially immediately after said determining if said activating object is in said proximate non-touching state.
30. The method of claim 28, wherein said selected time period is a short length of time.
31. The method of claim 30, wherein said short length of time is approximately one second.
32. The method of claim 28, wherein said display screen includes an inductive sensing system.
33. The method of claim 28, wherein said user interface employs a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.
34. The method of claim 28, wherein said smart compact device displays said data in a window.
35. The method of claim 28, wherein once said data has been displayed, touching said activating object to said display screen substantially on said data causes a first action to occur, said first action being different from a second action that would have occurred if said data had not been displayed.
36. The method of claim 28, wherein said smart compact device is a handheld device.
37. The method of claim 28, wherein said activating object is selected from the group consisting of a finger and a pen.
38. The method of claim 28, wherein said display screen is adapted to communicate with an activating object disposed in at least one of said touching state, said proximate non-touching state, and a third state.
39. The method of claim 38, wherein said third state is a second proximate non-touching state different from said proximate non-touching state.
40. The method of claim 28, further comprising:
determining if said activating object is in said touching state; and
hiding at least a portion of said data if said activating object is in said touching state.
41. The method of claim 28, wherein said data includes textual data.
42. The method of claim 28, wherein said data includes a graphic.
43. The method of claim 28, wherein said data includes a control object.
44. The method of claim 28, wherein said data includes additional data.
45. The method of claim 28 wherein said controlling step further comprises completely hiding said data.
46. The method of claim 28 wherein said controlling step comprises overlaying additional data over said portion of said data to thereby hide said data.
47. The method of claim 28 wherein said controlling step comprises restoring the portion of the display screen previously obscured by said data to thereby hide said data.
49. The method of claim 48, wherein said first proximity relationship includes a first function related to a distance between said activating object and said display screen.
50. The method of claim 49, wherein said second proximity relationship includes a second function related to said distance between said activating object and said display screen, said second function being different from said first function.
51. The method of claim 48, wherein said sensing said sensed relationship occurs for a pre-selected period of time.
52. The method of claim 48, wherein said second group of data is displayed a second selected time period after said determining if said sensed relationship is said second proximity relationship.
53. The method of claim 48, wherein said second proximity relationship includes a user-controlled parameter of said user interface.
54. The method of claim 53, wherein said user-controlled parameter is defined by a switch coupled to said activating object.
55. The method of claim 54, wherein said switch is a button.
56. The method of claim 48 wherein said controlling step further comprises completely hiding said portion of said second group of data.
57. The method of claim 48 wherein said controlling step comprises overlaying additional data over said portion of said second group of data to thereby hide said second group of data.
58. The method of claim 48 wherein said controlling step comprises restoring at least a portion of the first group of data obscured by said second group of data.

The invention relates generally to smart compact devices, and specifically to the display and manipulation of information on such devices by the use of a pen or capacitive input system for use with a pen, finger, or the like, comprising a pen sensor 18 and a pen control system 20, generally analogous to that described in publication WO 00/33244, entitled “Position Sensor”. This system has the key advantage that it can provide X and Y position data both when the activating object is touching the display screen and when it is held in close proximity to, but not touching (i.e., hovering over), the display screen, and is able to distinguish between the two states. Alternatively, the first state may require a firm touch while the second “nontouching” state may occur even when the activating object is lightly touching the display screen.

In an alternative embodiment, such an enhanced pointing device could also be a capacitive touchpad, capable of sensing an activating object, such as a finger, pen, and the like, in close proximity to its surface, as well as when touched to the surface.

In a further alternative embodiment, the enhanced pointing device could be any device capable of providing X and Y position data, and two or more states associated with that data, those states being a base state (corresponding to a mouse pointer position with no buttons pressed, or an activating object in close proximity to, but not touching a display screen), a selected state (corresponding to a mouse pointer position with the left button clicked, or an activating object touched to the display screen of a smart compact device), and zero or more further extended states (corresponding, for example, to a mouse with the right button clicked, or an activating object with a side button held in).
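For illustration only (not part of the patent text), the multi-state model just described could be represented as in the following minimal Python sketch; the names PointerState and PointerEvent are hypothetical:

    from dataclasses import dataclass
    from enum import Enum, auto

    class PointerState(Enum):
        BASE = auto()      # in close proximity, not touching (no buttons pressed)
        SELECTED = auto()  # touching the display screen (left button clicked)
        EXTENDED = auto()  # any further state (e.g., a side button held in)

    @dataclass
    class PointerEvent:
        x: int                # X position reported by the enhanced pointing device
        y: int                # Y position reported by the enhanced pointing device
        state: PointerState   # which state accompanies this position report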

In a further alternative embodiment, the enhanced pointing device is an inductive pen input system (or inductive sensing system) that is capable of providing X and Y position data, a base state, a selected state, and zero or more extended states, where the extended states are related to the distance of the activating object from the sensor.

In a further alternative embodiment, the enhanced pointing device is a device capable of providing X and Y position data, a base state, a selected state, and extended states, where the extended states are selected by a user-controlled parameter of the enhanced pointing device.

In an illustrative embodiment, the pen or finger control system 20 is driven by a pen driver 22, which can be software running on the microprocessor of the smart compact device. This pen driver 22 converts the pen input data 34 from the sensor system into pen position messages 36, formatted according to the requirements of the operating system 24. These messages 36 contain both the X and Y position data, and flags to signal which one of the states the pen system is reporting.
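As a hedged sketch of the driver's role described above, reusing the hypothetical PointerState and PointerEvent types from the earlier sketch; the 5-byte packet layout and the function name are invented for illustration and are not the actual format of the pen input data 34:

    from typing import Optional

    def decode_pen_packet(packet: bytes) -> Optional[PointerEvent]:
        # Hypothetical layout: two 16-bit little-endian coordinates,
        # then a flags byte (bit 0 = touching, bit 1 = in proximity).
        x = int.from_bytes(packet[0:2], "little")
        y = int.from_bytes(packet[2:4], "little")
        flags = packet[4]
        if flags & 0x01:
            state = PointerState.SELECTED   # touching state
        elif flags & 0x02:
            state = PointerState.BASE       # proximate non-touching (hover) state
        else:
            return None                     # out of sensing range: no message
        return PointerEvent(x, y, state)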

In an alternative embodiment, the software running on the smart compact device may be structured in a manner other than that shown in FIG. 3, as long as it is one that reports the activating object's position to the appropriate component of the compact device, such as the operating system or the user interface.

In an illustrative embodiment, the operating system 24 processes the pen or finger position message, and alters the visible state of any appropriate screen object according to the state of the activating object as reported in the pen position message. If the activating object has been held above a screen object for more than a short length of time (for example, a short length of time being typically 1 second or less), the operating system will change the data displayed on the screen so that the pop-up data is displayed. FIG. 4 shows a typical screen object 50, and a hyperlink 54. FIG. 5 shows the same screen object 50, and its associated pop-up data 52 that has been triggered by holding the activating object above the hyperlink 54.
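A minimal sketch of this timing rule, assuming the roughly one-second threshold mentioned above; PopupController and its callbacks are hypothetical names, and PointerState/PointerEvent come from the earlier sketch:

    HOVER_DELAY_S = 1.0  # "a short length of time", typically 1 second or less

    class PopupController:
        def __init__(self, show_popup, hide_popup):
            self.show_popup = show_popup    # callback that displays pop-up data 52
            self.hide_popup = hide_popup    # callback that hides it again
            self.hover_started_at = None    # when the current hover began
            self.popup_visible = False

        def on_event(self, ev: PointerEvent, now: float) -> None:
            if ev.state is PointerState.BASE:        # hovering over a screen object
                if self.hover_started_at is None:
                    self.hover_started_at = now
                elif (not self.popup_visible
                      and now - self.hover_started_at >= HOVER_DELAY_S):
                    self.show_popup(ev.x, ev.y)      # held long enough: show pop-up
                    self.popup_visible = True
            elif ev.state is PointerState.SELECTED:  # touching: hide the pop-up
                if self.popup_visible:
                    self.hide_popup()
                    self.popup_visible = False
                self.hover_started_at = None

In this sketch the caller would also reset the controller when the activating object leaves sensing range, which corresponds to the pop-up removal behavior described in the embodiments below.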

In an alternative embodiment, the pop-up data can be triggered immediately when the activating object is held over the screen object. Such an embodiment can be used, for example, to enable a user to easily dial the correct number from the contacts directory of a smartphone or similarly enabled portable device.

In an alternative embodiment, the selected data is displayed or hidden depending on whether the first state or the second state is detected. Information on the display screen is controlled in response to the presence of an activating object. At least a portion of the information can be selectively hidden and/or displayed in response to whether the activating object is in the first state or the second state.

FIG. 3B is an alternative embodiment generally analogous to FIG. 3A, except that FIG. 3B illustrates the invention in the context of a capacitive touchpad capable of sensing a finger in close proximity to its surface, as well as when touched to the surface. In FIG. 3B the enhanced pointing device is a capacitive finger sensor 18B as opposed to the pen sensor 18 of FIG. 3A. As in FIG. 3A, the control system 20B can provide X and Y position data both when the finger is touching the display screen and when it is held in close proximity to, but not touching, the display screen, and is able to distinguish between the two states. Alternatively, the first state may require a firm touch while the second “nontouching” state may occur even when the finger is lightly touching the display screen.

With continued reference to FIG. 3B, the control system 20B is driven by a driver 22B, which can be software running on the microprocessor of the touchpad. This driver 22B outputs finger position messages 36B, formatted according to the requirements of the operating system 24B. These messages 36B contain both the X and Y position data, and flags to signal which one of the states the system is reporting.

In an alternative embodiment, the software running on the capacitive touchpad may be structured in a manner other than that shown in FIG. 3B, as long as it is one that reports finger position to the appropriate component of the touchpad device, such as the operating system or the user interface.

In the embodiment illustrated in FIG. 3B, the operating system 24B processes the finger position message, and alters the visible state of any appropriate screen object according to the state of the finger as reported in the finger position message 36B. If the finger has been held above a screen object for more than a short length of time (for example, a short length of time being typically 1 second or less), the operating system will change the data displayed on the screen so that the pop-up data is displayed.

In an alternative embodiment, the pop-up data can be triggered immediately when the finger is held over the screen object. Such an embodiment can be used, for example, to enable a user to easily dial the correct number from the contacts directory of a smartphone or similarly enabled portable device.

In an alternative embodiment, the selected data is displayed or hidden depending on whether the first state or the second state is detected. Information on the display screen is controlled in response to the presence of a finger. At least a portion of the information can be selectively hidden and/or displayed in response to whether the finger is in the first state or the second state.

FIG. 6 shows a typical small display screen on a smartphone. The display screen 60 as depicted in FIG. 6 shows device status information 62, 64 and a window 66 containing directory information. Within the window 66, the directory information is shown as lines 68, each containing screen objects, such as the contact name 70, the contact company 72 and an icon 74 intended to trigger the dialing process. An alternative embodiment to this screen is shown in FIG. 7, where the contact names 70 do not have a dial icon associated with them.

The data displayed on this screen is a summary or abstraction of the data held in the smart compact device about each contact. Typically, each contact will have at least the data illustrated in FIG. 8 stored. The window 66 in FIG. 8 is displaying both detailed contact information 80, and control objects in the form of icons, which allow dialing the contact 86 and 88, sending email 84 or returning the display to the directory listing 82 of FIG. 7.

In the prior art smart compact devices that use a resistive touch screen as a pointing device, the process of dialing a contact from the directory would typically involve the steps shown in FIG. 9. The user would select the display of the directory, tap the desired contact name 70, and then tap the dial icon 88 in the detailed display screen. This involves three discrete user actions.

In an alternative prior-art embodiment, tapping a dial icon in the directory listing (reference numeral 74 in FIG. 6) would dial a number directly. This has two key disadvantages. Firstly, there will usually be more than one number or contact method associated with each contact, so accessing a detailed screen would still be necessary. Secondly, the small amount of abstracted data visible on the directory screen is not always sufficient to uniquely identify the desired contact.

FIG. 10 illustrates a typical prior-art process for dialing a number when the contact name may be ambiguous. If the desired contact 70 has the same first name and company as another contact 71, the wrong contact may be selected initially. Then the user has to click the return icon 82 to return to the directory listing (or perform a similar ‘undo’ action). Then the correct desired contact 70 must be selected, followed by the selection of the dial icon 88 of the correct contact. This process takes five discrete user actions.

In the present invention, the process is simplified for the user. FIG. 11 shows a process according to the present invention. The contacts directory is selected, the activating object is held stationary above the desired contact name 70, which triggers the pop-up data and displays the detailed contact information. As part of the same movement the user taps the dial icon 88 to initiate the dialing sequence. This process involves only two discrete user actions, together with a natural, intuitive movement of the activating object.

In the case of ambiguous names, the process of FIG. 12 is followed. Once the contacts directory is selected, the activating object is held above the incorrect contact name 71, triggering the pop-up data with the full contact details. Instead of having to tap the return icon 82, the user simply moves the activating object out of the proximity range, and then holds it over the correct contact name 70. The pop-up data shows the detailed information, and the user can tap on the dial icon 88, initiating dialing. The complete process takes only three discrete user actions, together with a natural intuitive movement of the activating object.

In an alternative embodiment, the removal of the pop-up data may be signaled by another user action, such as the movement of the activating object outside the area of the display screen occupied by the pop-up data.

It will be recognized by those skilled in the art that the present invention is not limited to use with a contacts directory, but can be used for any application that requires an abstracted display of detailed data, and ready access to the detailed data.

In an alternative embodiment, the pop-up data does not have to take up the whole display screen of the smart compact device, but can be displayed on a part of the display. FIG. 13 illustrates such a display. The activating object is held above the desired contact name 70, which is then highlighted, triggering the pop-up data containing contact details 90 in the lower section of the screen. Moving the activating object to another contact name will cause the pop-up data to change to details of the new contact, and dialing the contact can be initiated by tapping the dial icon 88. This embodiment has the advantage that the context of the contact name can be seen together with the contact details. In this embodiment, the pop-up data is not removed when the activating object is moved away from the display screen, but remains until an action icon has been selected, pop-up data has been triggered from another contact name, or another application is selected on the smart compact device.
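A short sketch of this persistent lower-pane variant; DetailPane and render_details are hypothetical names:

    class DetailPane:
        # Lower-screen pop-up pane (contact details 90) that follows the hover.
        def __init__(self, render_details):
            self.render_details = render_details  # callback that draws details 90
            self.current = None

        def on_hover(self, contact) -> None:
            # Hovering over another contact name changes the pane's contents;
            # moving the activating object away leaves the pane unchanged,
            # matching the persistence described above.
            if contact is not None and contact is not self.current:
                self.current = contact
                self.render_details(contact)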

In a further alternative embodiment, the directory listing takes the form of FIG. 14. Here, the display screen in the directory application always has a listing 92, and pop-up data 90 associated with the highlighted contact 70. Holding and moving the activating object in proximity to the display above the listing causes the highlight to move, changing the pop-up data to match the highlighted contact. This allows the user to ensure that the right contact is selected from several ambiguous contacts. Clicking, tapping, or touching the activating object on the highlighted contact brings up the full information screen of FIG. 8, allowing the user to select the required contact method. This embodiment addresses both key disadvantages of the prior art, while being easy and intuitive for the user.

It is a further advantage of this embodiment that movement of the activating object to the upper or lower extremes of the directory list (Reference numeral 92 of FIG. 14) can cause the list to scroll automatically.
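One way such auto-scrolling could work, sketched with invented names and an invented band height; the list scrolls while the hover position sits near either extreme of the directory list 92:

    SCROLL_BAND_PX = 16  # hypothetical height of the trigger band at each extreme

    def auto_scroll(list_top: int, list_bottom: int, hover_y: int,
                    offset: int, max_offset: int) -> int:
        # Return an updated scroll offset for the directory list.
        if hover_y <= list_top + SCROLL_BAND_PX:
            return max(0, offset - 1)           # hovering near the top: scroll up
        if hover_y >= list_bottom - SCROLL_BAND_PX:
            return min(max_offset, offset + 1)  # near the bottom: scroll down
        return offset                           # elsewhere: leave the list alone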

It will be realized by those skilled in the art that the present invention applies to all classes of screen objects, whether they are used in the software application user interface, as part of the operating system user interface, or in a smart compact device where the software is not structured in this way, but contains a user interface and an enhanced pointing device. It will also be appreciated that the novel use of the combination of an enhanced pointing device and pop-up data greatly enhances the usability of the user interface on a smart compact device, in a way that has not previously been demonstrated.

Alternatively, the behavior of the screen object may be modified by the state information returned by the enhanced pointing device with the X and Y position data, so that more than one control action may be produced from one control object.

Alternatively, the pop-up data for a screen object may be triggered by a further extended state of the enhanced pointing device, distinct from the base and selected states.

In an illustrative embodiment, the exact method of triggering the pop-up data, the duration of any timings associated with triggering the pop-up data, the range from the display screen that is defined as proximity and the form of any pop-up data will be determined by usability testing so as to maximize the benefit to the user in a given application.

While embodiments and applications of this invention have been shown and described, it would be apparent to those skilled in the art that many more modifications than mentioned above are possible without departing from the inventive concepts herein. The invention, therefore, is not to be restricted except in the spirit of the appended claims.

A user interface of a smart compact device is provided which includes: a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state; means configured to display data on the display screen responsive to the activating object being disposed in the proximate non-touching state for a selected time period; and means configured to hide at least a portion of the data responsive to the activating object being disposed in the touching state. The selected time period may be of a duration that ends substantially immediately after the activating object is disposed in the proximate non-touching state, wherein the selected time period may be a short length of time such as approximately one second.

The display screen may include an inductive sensing system, and the user interface may employ a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.
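The patent text does not spell the measurement out, but ratiometric techniques in general derive position from the ratio of sensor signals rather than their absolute amplitudes, which cancels common-mode variation such as overall coupling strength. A hedged illustration, assuming just two coils whose responses vary linearly and oppositely across the sensing area (this shows only the general idea, not the patented method):

    def ratiometric_position(coil_a: float, coil_b: float) -> float:
        # Normalized position in [0, 1] from two coil amplitudes. The ratio
        # is unchanged when both signals scale together (e.g., with hover height).
        total = coil_a + coil_b
        if total == 0:
            raise ValueError("no signal: activating object out of range")
        return coil_a / total

    # Same position whether the object is near (strong signal) or far (weak):
    assert abs(ratiometric_position(3.0, 1.0) - ratiometric_position(0.3, 0.1)) < 1e-9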

In an embodiment, the smart compact device displays the data in a window.

In another embodiment, once the data has been displayed, touching the activating object to the display screen substantially on the data causes a first action to occur, the first action being different from a second action that would have occurred if the data had not been displayed.

In other embodiments, the smart compact device may be a handheld device, the activating object is selected from the group consisting of a finger and a pen, and the display screen is adapted to communicate with an activating object disposed in at least one of the touching state, the proximate non-touching state, and a third state, wherein the third state is a second proximate non-touching state different from the proximate non-touching state.

In other embodiments, the data may include textual data, a graphic, a control object, and/or additional data.

In other embodiments, the smart compact device may be a personal digital assistant (PDA), a media player, a digital video device, a digital camera, and/or a mobile phone.

In other embodiments, the smart compact device executes a SYMBIAN operating system, a LINUX operating system, and/or a WINDOWS operating system.

In various embodiments, the hiding means may be configured to completely hide the data, hide the portion of the data by overlaying additional data over the portion of the data, and/or hide the portion of the data by restoring the portion of the display screen previously obscured by the data.

A method is provided for presenting and manipulating information in a user interface of a smart compact device including a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state. The method includes: determining if the activating object is in the proximate non-touching state; displaying data responsive to the activating object being disposed in the proximate non-touching state for a selected time period; and controlling the data on the display screen to hide second data responsive to the activating object being disposed in the proximate non-touching state.

In an embodiment, the selected time period is of a length of time that ends substantially immediately after determining if the activating object is in the proximate non-touching state, wherein the selected time period may be a short length of time such as approximately one second.

In the method, the display screen may include an inductive sensing system, and the user interface may employ a ratiometric measurement technique with a plurality of coils that each extend across a sensing area.

In an embodiment, the smart compact device displays the data in a window.

In various embodiments, once the data has been displayed, touching the activating object to the display screen substantially on the data causes a first action to occur, the first action being different from a second action that would have occurred if the data had not been displayed.

In various embodiments, the smart compact device is a handheld device, the activating object is selected from the group consisting of a finger and a pen, and the display screen is adapted to communicate with an activating object disposed in at least one of the touching state, the proximate non-touching state, and a third state, wherein the third state is a second proximate non-touching state different from the proximate non-touching state.

The method may also involve determining if the activating object is in the touching state; and hiding at least a portion of the data if the activating object is in the touching state.

In various embodiments of the method, the data may include textual data, a graphic, a control object, and/or additional data.

In an embodiment, the controlling step may involve completely hiding the data, overlaying additional data over the portion of the data to thereby hide the data, and/or restoring the portion of the display screen previously obscured by the data to thereby hide the data.

A method is also provided for controlling a display of data on a user interface presented on a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a first proximate non-touching state and a second proximate non-touching state, defined by a first and a second proximity relationship between the activating object and the display screen, respectively. The method includes sensing a sensed relationship between the activating object and the display screen; determining if the sensed relationship is the first proximity relationship; displaying a first group of data on the display screen following a selected time period if the sensed relationship is the first proximity relationship; determining if the sensed relationship is the second proximity relationship; displaying a second group of data on the display screen if the sensed relationship is the second proximity relationship; and controlling the display screen to hide at least a portion of the second group of data responsive to the activating object being disposed in the first proximity relationship.

In an embodiment, the first proximity relationship includes a first function related to a distance between the activating object and the display screen, and the second proximity relationship includes a second function related to the distance between the activating object and the display screen, the second function being different from the first function.
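A minimal sketch of two such distance-based proximity relationships; the threshold values are invented for illustration:

    from typing import Optional

    FIRST_MAX_MM = 20.0   # hypothetical outer band: first proximity relationship
    SECOND_MAX_MM = 5.0   # hypothetical inner band: second proximity relationship

    def proximity_relationship(distance_mm: float) -> Optional[str]:
        # Classify the sensed distance into one of the two relationships.
        if distance_mm <= SECOND_MAX_MM:
            return "second"   # near the screen: show the second group of data
        if distance_mm <= FIRST_MAX_MM:
            return "first"    # farther away: show the first group of data
        return None           # out of range: neither relationship applies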

In an embodiment, sensing the sensed relationship occurs for a pre-selected period of time.

In an embodiment, the second group of data may be displayed a second selected time period after determining if the sensed relationship is the second proximity relationship.

In an embodiment, the second proximity relationship includes a user-controlled parameter of the user interface, wherein the user-controlled parameter may be defined by a switch coupled to the activating object, and wherein the switch may be a button.

In various embodiments, the controlling step may involve completely hiding the portion of the second group of data, overlaying additional data over the portion of the second group of data to thereby hide the second group of data, and/or restoring at least a portion of the first group of data obscured by the second group of data.

A method is also provided for presenting and manipulating information in a user interface of a smart compact device comprising a display screen configured to report position information, the display screen adapted to communicate with an activating object disposed in at least one of a touching state and a proximate non-touching state. The method involves the steps of: determining if the activating object is in the proximate non-touching state; displaying data responsive to the activating object being disposed in the proximate non-touching state for a selected time period; determining if the activating object is in the touching state; and hiding at least a portion of the data if the activating object is in the touching state.

Inventors: Gillespie, David W.; Foote, Geoffrey

Patent Priority Assignee Title
4817034, Feb 11 1986 DAMILIC CORPORATION A MARYLAND CORPORATION Computerized handwriting duplication system
5149919, Oct 31 1990 International Business Machines Corporation Stylus sensing system
5347295, Oct 31 1990 THE CHASE MANHATTAN BANK, AS COLLATERAL AGENT Control of a computer through a position-sensed stylus
5805164, Apr 29 1996 Microsoft Technology Licensing, LLC Data display and entry using a limited-area display panel
5861583, Jun 08 1992 Synaptics Incorporated Object position detector
5923327, Apr 23 1997 BlackBerry Limited Scrolling with automatic compression and expansion
5995101, Oct 29 1997 Adobe Systems Incorporated Multi-level tool tip
6054979, Aug 21 1996 HEWLETT-PACKARD DEVELOPMENT COMPANY, L P Current sensing touchpad for computers and the like
6133906, Mar 15 1993 3M Innovative Properties Company Display-integrated stylus detection system
6236396, May 27 1992 Apple Inc Method and apparatus for controlling a scheduler
6424338, Sep 30 1999 Gateway, Inc. Speed zone touchpad
6429846, Jun 23 1998 Immersion Corporation Haptic feedback for touchpads and other touch controls
6483526, Sep 24 1998 International Business Machines Corporation Multi-layer entry fields
6486874, Nov 06 2000 Google Technology Holdings LLC Method of pre-caching user interaction elements using input device position
6492979, Sep 07 1999 ELO TOUCH SOLUTIONS, INC Dual sensor touchscreen utilizing projective-capacitive and force touch sensors
6563514, Apr 13 2000 EXTENSIO SOFTWARE, INC System and method for providing contextual and dynamic information retrieval
6587587, May 20 1993 Microsoft Corporation System and methods for spacing, storing and recognizing electronic representations of handwriting, printing and drawings
6614422, Nov 04 1999 Microsoft Technology Licensing, LLC Method and apparatus for entering data using a virtual input device
6674425, Dec 10 1996 Willow Design, Inc. Integrated pointing and drawing graphics system for computers
6930672, Oct 19 1998 Fujitsu Limited Input processing method and input control apparatus
US 2003/0122774
US 2003/0137522
US 2006/0004874
EP 0802476
EP 0996052
WO 00/28407
WO 00/33244
WO 99/54807
Assignments (Executed on / Assignor / Assignee / Conveyance / Reel/Frame):
Jul 25 2001: GILLESPIE, DAVID W. to Synaptics Incorporated; assignment of assignors interest (see document for details); Reel/Frame 037352/0335
Jul 27 2001: FOOTE, GEOFFREY to Synaptics Incorporated; assignment of assignors interest (see document for details); Reel/Frame 037352/0335
Oct 24 2013: Wacom Co., Ltd. (assignment on the face of the patent)
Sep 27 2017: SYNAPTICS INCORPROATED to Wells Fargo Bank, National Association; security interest (see document for details); Reel/Frame 051316/0777
Sep 27 2017: Synaptics Incorporated to Wells Fargo Bank, National Association; corrective assignment to correct the spelling of the assignor name previously recorded at Reel 051316, Frame 0777; Reel/Frame 052186/0756
Aug 05 2019: Wells Fargo Bank, National Association to Synaptics Incorporated; release by secured party (see document for details); Reel/Frame 050206/0252
Aug 27 2019: Synaptics Incorporated to WACOM CO., LTD.; assignment of assignors interest (see document for details); Reel/Frame 050229/0145


Date Maintenance Schedule
Oct 29 2022: 4-year fee payment window opens
Apr 29 2023: 6-month grace period starts (with surcharge)
Oct 29 2023: patent expires (for year 4)
Oct 29 2025: 2-year window to revive an unintentionally abandoned patent ends (for year 4)
Oct 29 2026: 8-year fee payment window opens
Apr 29 2027: 6-month grace period starts (with surcharge)
Oct 29 2027: patent expires (for year 8)
Oct 29 2029: 2-year window to revive an unintentionally abandoned patent ends (for year 8)
Oct 29 2030: 12-year fee payment window opens
Apr 29 2031: 6-month grace period starts (with surcharge)
Oct 29 2031: patent expires (for year 12)
Oct 29 2033: 2-year window to revive an unintentionally abandoned patent ends (for year 12)