A method, memory, and apparatus directing a computer system, having at least a processor, memory, and touchscreen, to create a hibernatable virtual pointing device. The method includes the steps of: creating a virtual pointing device on the touchscreen under at least a first portion of a hand positioned in a first location on the touchscreen, whereby a hand behavior over the virtual pointing device causes a command to be invoked; in response to detecting the hand no longer being positioned on the touchscreen, placing the virtual pointing device in hibernation; and, in response to detecting the hand being re-positioned at a second location on the touchscreen, moving the virtual pointing device to the second location and bringing the virtual pointing device out of hibernation, whereby the hand behavior over the virtual pointing device causes the command to be invoked.

Patent: 5,933,134
Priority: Jun 25, 1996
Filed: Jun 25, 1996
Issued: Aug 03, 1999
Expiry: Jun 25, 2016
Assignee Entity: Large
Status: EXPIRED
1. A method for directing a computer system, having at least a processor, memory and a touch screen, to create a hibernatable virtual pointing device, comprising the steps of:
creating a virtual pointing device within a memory displayed on the touchscreen under at least a first portion of a hand positioned in a first location on the touchscreen, whereby a hand behavior over the virtual pointing device causes a command to be invoked,
placing the virtual pointing device in a translucent hibernation state for a first specified period of time in response to detecting the hand no longer being positioned on the touchscreen;
gradually fading said virtual pointing device following said first specified period of time for a second specified period of time;
removing said virtual pointing device from memory following termination of said second specified period of time; and
moving the virtual pointing device to a second location and bringing the virtual pointing device out of the translucent hibernation state in response to detecting the hand being positioned at the second location on the touchscreen within said first specified period of time or within said second specified period of time, whereby hand behavior over the virtual pointing device causes a command to be invoked.
5. A computer system having at least a processor, memory and touchscreen, for creating a hibernatable virtual pointing device, said computer system comprising:
means for creating a virtual pointing device in memory displayed on the touchscreen displayed under at least a first portion of a hand positioned in a first location on the touchscreen, whereby hand behavior over the virtual pointing device causes a command to be invoked;
means for placing the virtual pointing device in a translucent hibernation state for a first specified period of time in response to detecting the hand no longer being positioned on the touchscreen;
means for gradually fading said virtual pointing device following said first specified period of time for a second specified period of time;
means for removing said virtual pointing device from memory following termination of said second specified period of time; and
means for moving the virtual pointing device to a second location and bringing the virtual pointing device out of hibernation in response to detecting the hand being repositioned in a second location on the touchscreen within said first specified period of time or within said second specified period of time, whereby hand behavior over the virtual pointing device causes a command to be invoked.
7. An article of manufacture, comprising:
a computer usable medium having computer readable program code means embodied therein, the computer readable code means in the article of manufacture comprising:
computer readable code means for causing a computer system having at least a touchscreen to create a virtual pointing device on the touchscreen under at least a first portion of a hand positioned in a first location on the touchscreen, whereby a hand behavior over the virtual pointing device causes a command to be invoked;
computer readable program code means for causing the computer system to place the virtual pointing device in a translucent hibernation state for a first specified period of time in response to detecting the hand no longer positioned on the touchscreen;
computer readable program code means for gradually fading said virtual pointing device following said first specified period of time for a second specified period of time;
computer readable program code means for removing said virtual pointing device from a memory following said termination of said second specified period of time; and
computer readable program code means for moving the virtual pointing device to a second location and bringing the virtual pointing device out of hibernation in response to detecting the hand being repositioned at the second location on the touchscreen within said first specified period of time or within said second specified period of time, whereby hand behavior over the virtual pointing device causes a command to be invoked.
2. The method according to claim 1 wherein said step of placing the virtual pointing device in a translucent hibernation state for a first specified period of time in response to detecting the hand no longer being positioned on the touchscreen further comprises the step of displaying at the first location at least one translucent area, wherein hand behavior on the translucent area fails to cause a command to be invoked.
3. The method according to claim 2 further comprising the step of failing to cause a command to be invoked when a single click of a finger is detected on the translucent area.
4. The method according to claim 2 further comprising the step of:
failing to cause the command to be invoked when a double click of a finger is detected on the translucent area.
6. The computer system according to claim 5 further comprising:
means for displaying at the first location at least one translucent area, wherein hand behavior fails to cause a command to be invoked.

The present invention appears to claim subject matter disclosed in prior co-pending application, Ser. No. 08/654,486, IBM Docket No. AT9-96-065, filed on May 28, 1996, co-pending application, Ser. No. 08/654,487, IBM Docket No. AT9-96-066, filed on May 28, 1996, co-pending application, Ser. No. 08/664,038, IBM Docket No. AT9-96-017, filed on May 28, 1996, co-pending application, Ser. No. 08/664,088, IBM Docket AT9-96-068, and co-pending application, Ser. No. 08/664,037, IBM Docket No. AT9-96-069, filed on Jun. 13, 1996, co-pending application, Ser. No. 08/664,036, IBM Docket No. AT9-96-070, filed on Jun. 13, 1996, co-pending application, Ser. No. 08/672,519, IBM Docket No. AT9-96-084, filed on Jun. 25, 1996, co-pending application, Ser. No. 08/672,518, IBM Docket AT9-96-083, and co-pending application, Ser. No. 08/672,520, IBM Docket No. AT9-96-086, filed on Jun. 25, 1996.

1. Field of the Invention

The present invention relates to pointing devices and pointers and, more particularly, but without limitation, to pointing devices for use on touchscreen systems.

2. Background Information and Description of the Related Art

Conventionally, users interface with the desktop and operating system of their computer system using a "mouse". A mouse is a special hardware input device connected by a wire or infrared signal to the computer system. Typically, the mouse has one or more push buttons on its top and a roller on its bottom designed to roll along a surface next to the computer system. When the user moves the mouse's roller on the surface, a mouse pointer positioned on the computer system's display tracks the movement of the mouse's roller. When the user has positioned the pointer at a desirable location, such as over an object, the user single-clicks or multiple-clicks, depending on how the mouse is programmed, one of the mouse push buttons to invoke or open the object.

The user may customize the operations of a mouse and mouse pointer. Through a customization menu maintained by some conventional operating systems, the user may customize, for example, the assignment of a single click of a first mouse button to invoke a first function and the assignment of a single click over a second mouse button to invoke a second function. Further, the user may customize the appearance of the mouse pointer on the display screen. For example, one user may prefer a small arrow to be the mouse pointer, while another user may prefer a large blinking arrow. Similarly, some users may prefer a fast mouse pointer (i.e., small movements of the mouse's roller cause large movement of the mouse pointer), while other users may prefer a slower mouse pointer. This feature is referred to as the "sensitivity" of the mouse pointer.

These types of mouse and mouse pointer behaviors may be customized for each individual user. However, most operating systems provide customization for only one user on one system. Therefore, for a multi-user system, the user must re-customize the mouse from the prior setting. This typically involves the user editing a mouse settings file or local database that maps button behavior to a specific function. Some systems, such as X11, have a special init file to do this.

Conventional mice suffer certain disadvantages and limitations. For example, the mouse is bulky; is fixed in size, so that neither very small nor very large hands fit properly over it; is not permanently attached to the computer system; is subject to corrosion; and requires the user to know the specific mouse behavior (e.g., which mouse button and how many clicks invoke a function). Accordingly, many customer-oriented systems (e.g., ATMs) and multi-user systems do not use mice. Rather, the trend for multi-user systems is to use touchscreens.

Conventional touchscreens allow the user's finger or a pointing device to replace the conventional mouse and mouse pointer. Conventional touchscreens utilize, for example, heat sensitive, sound sensitive, pressure sensitive, or motion sensitive grids/detectors to detect a hand, finger, or object placed on the touchscreen. However, conventional touchscreens suffer certain limitations and disadvantages. For example, unlike a mouse pointer, fingers vary in size and, therefore, the desktop must place contiguous object icons and text far apart to accommodate the largest fingers. Also, the user cannot select the customization features as found in conventional mice and mouse pointers.

Accordingly, there would be great demand for a new pointing device that uses touchscreen technology, but allows object icons and text to be placed close to one another and allows user customization of the pointing device.

A method, memory, and apparatus directing a computer system, having at least a processor, memory, and touchscreen, to create a hibernatable virtual pointing device. The method includes the steps of: creating a virtual pointing device on the touchscreen under at least a first portion of a hand positioned in a first location on the touchscreen, whereby a hand behavior over the virtual pointing device causes a command to be invoked; in response to detecting the hand no longer being positioned on the touchscreen, placing the virtual pointing device in hibernation; and, in response to detecting the hand being re-positioned at a second location on the touchscreen, moving the virtual pointing device to the second location and bringing the virtual pointing device out of hibernation, whereby the hand behavior over the virtual pointing device causes the command to be invoked.

FIG. 1 illustrates a conventional hardware configuration for use with the present invention.

FIG. 2 illustrates a virtual pointing device in accordance with the present invention.

FIG. 3 illustrates detailed logic in the form of a flowchart for performing the steps in accordance with the present invention.

FIG. 4 illustrates a variation of the virtual pointing device illustrated in FIG. 2.

FIG. 5 illustrates another view of the virtual pointing device shown in FIG. 2.

FIG. 6 illustrates a menu for defining the characteristics of the virtual pointing device in accordance with the present invention.

FIG. 7 illustrates a shape menu, define functionality menu, and define pointer menu in accordance with the present invention.

FIG. 8 illustrates detailed logic in the form of a flowchart for performing the steps in accordance with the present invention.

FIG. 9 illustrates detailed logic in the form of a flowchart for performing the steps in accordance with the present invention.

The preferred embodiments may be practiced in any suitable hardware configuration that uses a touchscreen, such as computing system 100 illustrated in FIG. 1 or, alternatively, in a laptop or notepad computing system. Computing system 100 includes any suitable central processing unit 10, such as a standard microprocessor, and any number of other objects interconnected via system bus 12. For purposes of illustration, computing system 100 includes memory, such as read only memory (ROM) 16, random access memory (RAM) 14, and peripheral memory devices (e.g., disk or tape drives 20) connected to system bus 12 via I/O adapter 18. Computing system 100 further includes a touchscreen display adapter 36 for connecting system bus 12 to a conventional touchscreen display device 38. Also, user interface adapter 22 could connect system bus 12 to other user controls, such as keyboard 24, speaker 28, mouse 26, and a touchpad 32 (not shown).

One skilled in the art readily recognizes how conventional touchscreens operate, how conventional touchscreen device drivers communicate with an operating system, and how a user conventionally utilizes a touchscreen to initiate the manipulation of objects in a graphical user interface. For example, touchscreen technology includes electronic sensors positioned inside a flexible membrane covering a computer screen, a grid of infrared signals, or a method of detecting a touch by sensing a change in reflected sound waves through glass or plastic. Using current touchscreen technology, a user can initiate the display of a pull down menu by touching the touchscreen, and then selecting an object within that menu by dragging a finger down the pull down menu.

A graphical user interface (GUI) and operating system (OS) of the preferred embodiment reside within computer-readable media and contain a touchscreen device driver that allows one or more users to initiate the manipulation of displayed object icons and text on a touchscreen display device. Any suitable computer-readable media may retain the GUI and operating system, such as ROM 16, RAM 14, disk and/or tape drive 20 (e.g., magnetic diskette, magnetic tape, CD-ROM, optical disk, or other suitable storage media).

In the preferred embodiments, the COSE™ (Common Operating System Environment) desktop GUI interfaces the user to the AIX™ operating system. The GUI may be viewed as being incorporated and embedded within the operating system. Alternatively, any suitable operating system or desktop environment could be utilized. Examples of other GUIs and/or operating systems include X11™ (X Windows) graphical user interface, Sun's Solaris™ operating system, and Microsoft's Windows 95™ operating system. While the GUI and operating system merely instruct and direct CPU 10, for ease in explanation, the GUI and operating system will be described as performing the following features and functions.

Referring to FIG. 2, touchscreen 200 includes any conventional, suitable touchscreen that is sensitive to, for example, heat, pressure, or the sound of palm and fingerprints. In this illustration, a user has placed his/her right hand (not shown) on touchscreen 200. While any suitable touchscreen technology may be used, for ease in explanation, the preferred embodiment will be described as using a touchscreen that detects sound patterns. In response to the user placing his/her hand on touchscreen 200, touchscreen 200 detects the sound pattern of the user's hand, including the sound from palmprint area 210, thumbprint area 215, fingerprint areas 220, 230, 235, and 240, and areas 280. Alternatively, only a portion of the hand (e.g., only fingers) and/or a unique object (e.g., stylus) could be substituted for the detection of a hand print. Moreover, more than one hand or object can be detected at a time.

When touchscreen 200 detects one or more hand/finger patterns similar to the one shown in FIG. 2, the OS attempts to identify the user(s). To do so, the OS measures the distance of each of the thumbprint and fingerprint areas 215, 220, 230 and 240 from palmprint area 210, along with the X, Y coordinates of palmprint area 210 and the X, Y extremities of the palmprint area 210. The OS defines the cross point of the leftmost and uppermost point of the palmprint area 210 as the first reference point 255. The OS measures the longest distance from thumbprint 215 to the first reference point 255. Similarly, the OS measures the longest distance from fingerprint areas 220 and 230, respectively, to first reference point 255.

In the same manner, the OS defines the cross point of the rightmost and uppermost point of palmprint area 210 as the second reference point 260, whereby the longest distance from fingerprint area 240 to the second reference point 260 is determined. Finally, the OS measures the X and Y coordinates 265 and 270 of palmprint area 210. To add even more accuracy, the size of each fingerprint could be measured.
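By way of illustration only, the reference-point and distance measurements described above may be sketched as follows. This Python sketch is not the disclosed implementation: it assumes each print arrives as a list of (x, y) touchscreen points with the origin at the upper-left, and all function names are hypothetical.

```python
import math

def bounding_box(points):
    """Return (min_x, min_y, max_x, max_y) of a set of (x, y) touch points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

def longest_distance(print_points, ref):
    """Longest distance from any point of a print to a reference point."""
    return max(math.dist(p, ref) for p in print_points)

def measure_hand(palm, thumb, finger_prints):
    """Compute the identification measurements described in the text.

    palm and thumb are lists of (x, y) points; finger_prints is a list of
    such lists, ordered index finger to little finger.  With a top-left
    origin, the "uppermost" edge of the palmprint is its minimum y.
    """
    min_x, min_y, max_x, max_y = bounding_box(palm)
    ref1 = (min_x, min_y)   # leftmost/uppermost corner of the palmprint
    ref2 = (max_x, min_y)   # rightmost/uppermost corner of the palmprint
    return {
        "palm_extent": (max_x - min_x, max_y - min_y),
        "thumb": longest_distance(thumb, ref1),
        # first fingers are measured from the first reference point
        "fingers_ref1": [longest_distance(f, ref1) for f in finger_prints[:-1]],
        # the last finger is measured from the second reference point
        "finger_ref2": longest_distance(finger_prints[-1], ref2),
    }
```

A real driver would derive these point sets from the touchscreen's sound, heat, or pressure grid; the sketch shows only the geometry.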

Next, the OS searches a user file database (not shown) stored in memory for a match of the newly determined measurements with any existing measurements to determine if a stored identity exists for the handprint. Specifically, the OS compares the four distance measurements and the X, Y coordinates of palmprint 210 with any existing measurements stored in the user file database. However, one skilled in the art realizes that numerous means exists for identifying the handprint (or object print) of a particular user (or user's object) without departing from the scope and spirit of this invention. Illustratively, only the width of the palmprint area 210 could be used to determine if a match existed.
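The search of the user file database reduces to comparing measurement vectors within a tolerance. A minimal sketch, assuming measurements are stored as equal-length lists of numbers keyed by user identification; the 10% figure is the default tolerance given later in the flowchart description, and the function names are hypothetical:

```python
def within_tolerance(measured, stored, tolerance=0.10):
    """True if every measurement matches its stored counterpart to within
    the given fractional tolerance (10% by default, per the text)."""
    return all(abs(m - s) <= tolerance * s for m, s in zip(measured, stored))

def find_user(measured, user_files, tolerance=0.10):
    """Return the first user whose stored measurements match, else None
    (no match leads to building a default or customized device)."""
    for user_id, stored in user_files.items():
        if within_tolerance(measured, stored, tolerance):
            return user_id
    return None
```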

If the OS finds a match within a user-defined (or default) acceptable tolerance (described herein), the OS reads the user file for pre-defined customization features, if any, and creates a virtual pointing device under the hand (or a portion of the hand) positioned on touchscreen 200 using the pre-defined customization features. Additionally, one skilled in the art recognizes that a secondary confirmation of the user match could be made through, for example, a user id label displayed next to the virtual pointing device, or a specific color shading of the virtual pointing device. Therefore, the areas of touchscreen 200 under, for example, the user's thumb (i.e., thumbprint area 215), fingers (i.e., fingerprint areas 220, 230, 235, and 240), and palm (i.e., palmprint area 210) become "activated", such that certain defined movements of the user's fingers, thumb, and/or palm on those "activated" areas cause certain functions to be invoked. However, if the OS does not recognize the handprint, the OS can build a default virtual pointing device under the hand or a portion of the hand using a default set of functions or the user can create a customized virtual pointing device (described herein).

FIG. 5 illustrates how the user(s) move and operate the virtual pointing device(s). As the user slides his/her hand over touchscreen 200 such that the hand remains in substantial contact with touchscreen 200, the OS detects the position of the user's moving hand on touchscreen 200 and, in response, continuously re-defines the "activated" areas of the virtual pointing device to be the areas under the hand (or a portion of the hand). Therefore, the virtual pointing device moves with and according to the movement of the user's hand. For example, if an "activated" area is initially defined as the area contained within the touchscreen pixel coordinates [X1, Y1, X2, Y2, X3, Y3, and X4, Y4] (not shown) and the user moves a finger from that area to the touchscreen pixel coordinates [X5, Y5, X6, Y6, X7, Y7, and X8, Y8], the "activated" area moves to those new coordinates.
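The continuous re-definition of "activated" areas amounts to translating each area's pixel coordinates by the detected hand movement. A minimal sketch, with each area represented as a list of (x, y) corner coordinates (an illustrative representation, not the disclosed one):

```python
def move_activated_areas(areas, dx, dy):
    """Translate each activated area (a list of (x, y) pixel corners) by
    the detected hand displacement (dx, dy), so that the virtual pointing
    device tracks the moving hand."""
    return [[(x + dx, y + dy) for (x, y) in area] for area in areas]
```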

The OS positions pointer 250 near an activated area of the virtual pointing device (in this case, over fingerprint area 230) such that pointer 250 moves in lock step with the virtual pointing device. Therefore, the user could, for example, move the virtual pointing device and, therefore, pointer 250, such that pointer 250 is positioned over a desired object icon. Alternatively, the user could merely lift his hand and place it at a desired location, whereby the OS would re-create the virtual pointing device under the user's hand at the new location (described herein).

The user operates the virtual pointing device via movement of the user's fingers, thumb and/or palm. Illustratively, the user may invoke the "focus function" 245, whereby an object icon positioned under pointer 250 gains focus, by lifting his/her thumb and then placing the thumb back on thumbprint area 215 within a certain amount of time (e.g., two seconds) (referred to as "single clicking"). Similarly, the user may invoke the "paste" function by lifting and replacing his/her third finger on third fingerprint area 235 within a certain amount of time.

Each finger, palm, and thumb behavior and associated functionality/command can be specially defined, and later redefined, to invoke a specific function (described in more detail herein). The OS displays a dialog above each fingerprint/thumbprint area to indicate the finger behavior (a "(1)" representing a single click; a "(2)" representing a double click, etc.) and corresponding functionality/command (e.g., focus 245, open 257, select until release 259, paste 261 and default menu 262).

The default functionality/command, finger behavior and pointer are defined in the preferred embodiment as follows. A single click of the thumb on thumbprint area 215 causes the OS to invoke focus function 245 on any object icon or text positioned under pointer 250. A single click of a finger on fingerprint area 220 or a double click of thumbprint area 215 causes the OS to invoke an open function 257 on any object icon or text positioned under pointer 250. A single click on fingerprint area 230 invokes a select until release function 259 on any object icon or text positioned under pointer 250, while a single click of fingerprint area 235 invokes a paste function 261 on any object icon or text positioned under pointer 250. Finally, a single click of fingerprint area 240 invokes a default menu function 262. The default pointer 250 is in the shape of an arrow and is positioned near fingerprint area 230. However, one skilled in the art readily recognizes that any combination of default functions, pointer location, and/or finger behavior (e.g., multiple clicks) could have been used to define the default virtual pointing device. Moreover, a simultaneous single click (or multiple clicks) of two or more fingers could invoke a function/command.
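The default assignments above can be viewed as a lookup table keyed by print area and click count, with undefined behaviors invoking nothing. An illustrative Python sketch; the area names and command strings are hypothetical labels for the functions described in the text:

```python
# Default mapping from (print area, click count) to command, per the text.
DEFAULT_BEHAVIOR = {
    ("thumb", 1): "focus",
    ("thumb", 2): "open",
    ("finger_1", 1): "open",
    ("finger_2", 1): "select_until_release",
    ("finger_3", 1): "paste",
    ("finger_4", 1): "default_menu",
}

def invoke(area, clicks, behavior=DEFAULT_BEHAVIOR):
    """Return the command for a click behavior, or None when the behavior
    has no defined functionality (in which case nothing is invoked)."""
    return behavior.get((area, clicks))
```

Because the table is per-user data, re-customization reduces to rewriting entries in the user file.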

FIG. 3 illustrates a flow chart containing detailed logic for implementing the preferred embodiments. At 302, touchscreen 200 detects sound/heat/pressure, etc., from a handprint (or object), or alternatively, a portion of a handprint. At 306, the OS reads the handprint and calculates the measurements previously described and illustrated in FIG. 2. At 310, the OS searches user files in a database for the handprint measurements. At 312, if the OS locates any existing handprint measurements within a default tolerance of 10% (which can later be changed by the user, described herein), at 320, the OS reads all information in that user file and, at 322, draws a virtual pointing device on the touchscreen under the user's hand (or portion of the hand) based on pre-defined characteristics found in the user file. Additionally, in the future, if any objects and/or text have been selected by the virtual pointing device, they will be drawn in a position relative to their previous location to the virtual pointing device (described herein). At 333, the OS requests a confirmation that the user match is correct. If the user continues to use the virtual pointing device, then the OS interprets this use as a confirmation. The OS displays a user id label next to the virtual pointing device or, alternatively, may display the virtual pointing device as a specific color shading. Otherwise, if the wrong user has been assumed, control returns to 310, where the OS searches the database for another possible match.

At 324, the OS determines if there is any consistent unusual behavior or undefined behavior for four or more seconds, such as, for example, failing to detect the fingerprint(s), the palmprint, or no handprint on the touchscreen. If the OS detects no unusual behavior, the OS performs a work event loop at 326 (see FIG. 9) and control returns to 324. Referring to FIG. 9, at 902, the OS determines if any movement of the hand across the touchscreen has occurred and, if so, at 904 the OS moves the virtual pointing device in accordance with the movement of the hand. At 906, the OS determines if movement of a finger or thumb has occurred to invoke a function/command and, if so, at 908 the OS invokes that function/command on any object/text positioned under the pointer. Control returns to 324.
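The FIG. 9 work event loop may be sketched as follows. The event representation is an illustrative assumption, not the disclosed interface: "move" events carry the hand's displacement and "click" events carry a print area and click count.

```python
def work_event_loop(events, device):
    """One pass of the FIG. 9 work loop over queued touch events.

    device is a dict holding the pointing device's position and its
    behavior-to-command table.  Movement of the hand moves the device
    (step 904); a defined finger/thumb behavior invokes its command on
    whatever lies under the pointer (step 908).
    """
    invoked = []
    for ev in events:
        if ev["type"] == "move":
            device["x"] += ev["dx"]
            device["y"] += ev["dy"]
        elif ev["type"] == "click":
            cmd = device["behavior"].get((ev["area"], ev["clicks"]))
            if cmd is not None:
                invoked.append(cmd)
    return invoked
```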

Returning to 324 of FIG. 3, if the OS detects unusual behavior or undefined behavior for a certain amount of time (e.g., 4 seconds), at 328, the OS determines if all fingers have been lifted off the touchscreen while the palm remains on the touchscreen. Alternatively, one skilled in the art recognizes that many other indicators could replace the "all fingers lifted" indicator, such as determining if a combination of fingers have been lifted or determining if the palm has been lifted while the fingerprints remain in contact with the touchscreen. If the OS determines that all fingers have been lifted off the touchscreen, at 330, the OS displays a main menu 600 (see FIG. 6, described herein) prompting the user to provide any re-customization of the virtual pointing device. At 344, the OS displays the new virtual pointing device in accordance with any changes made at 330 and control returns to 324.

Returning to 328, if all fingers were not detected as being raised while the palm remained in contact with the touchscreen, at 342, control is directed to FIG. 8. Referring to FIG. 8, at 810, the OS determines if the entire hand (or object) has been lifted off the touchscreen. If the entire hand has not been lifted off the touchscreen, but unusual or undefined behavior has occurred, such as lifting a combination of fingers, thumb and/or palm (whose behavior does not have a corresponding defined functionality), control is directed to 814, where the OS re-draws the virtual pointing device under the hand based on the user file. This indicates to the user that the immediate past hand/finger behavior has no defined function. If the entire hand has been lifted off the touchscreen, at 811, the OS continues to display the virtual pointing device on the touchscreen in its current location for a period of time (e.g., 5 seconds), but in an obvious hibernated state, meaning the fingerprint and palmprint areas will be viewed as translucent areas on the touchscreen. When the virtual pointing device is in the obviously hibernated state, no functionality can be invoked until it is activated (i.e., brought out of hibernation, described herein). At 812, the OS determines if the hand has been re-positioned on the touchscreen within five seconds of detecting its removal. If the hand has not been re-positioned on the touchscreen within the five seconds, control is directed to 826 (described herein). However, if the OS detects the hand being re-positioned on the touchscreen within 5 seconds, at 816, the OS determines if more than one virtual pointing device is concurrently being used and, if so, if more than one user had lifted his/her hand off the touchscreen at the time the hand was re-positioned on the touchscreen. If not, at 814, control is directed to 322 of FIG. 3, whereby the OS activates and moves the virtual pointing device identified by the user file under the re-positioned hand.
Additionally, if any objects and/or text were selected by the virtual pointing device at the time the hand was lifted, they will be re-drawn in a position relative to their previous location to the virtual pointing device (described herein).

If more than one user had concurrently lifted his/her hand off the touchscreen, at 820, the OS reads the handprint of the re-positioned hand and calculates the measurements previously described and illustrated in FIG. 2. At 822, the OS searches the user files of the virtual pointing devices having a detected lifted hand for a hand measurement match. If a match is not found, at 823, the OS searches the user file database for the user identification of one of the virtual pointing devices having a detected lifted hand. The OS then displays a dialog (not shown) asking the user if he/she is the user identified by the user identification. If the user indicates that he/she is identified by the user identification at 825, at 826, control is directed to 322 of FIG. 3, whereby the OS moves the virtual pointing device identified by the user file under the re-positioned hand, and if any objects and/or text were selected by the virtual pointing device, they will be re-drawn in a position relative to their previous location to the virtual pointing device (described herein). However, if the user indicates that the identification does not identify the user at 825, the OS determines if that identification is the last user file of a virtual pointing device having a detected lifted hand. If not, control returns to 823 where the OS searches the next user file of a virtual pointing device having a detected lifted hand. This process repeats until a match is found between the user and the user identification and, therefore, the corresponding virtual pointing device having a detected lifted hand. If the OS has searched the last user file and no match has been found, at 839, control is directed to 310 of FIG. 3, where the OS searches all the user files for the user's hand.

Returning to 812, if the hand has not been repositioned on the touchscreen within 5 seconds, at 826, the OS continues to display the virtual pointing device in the obvious hibernated state and, at 828, prompts the user in a dialog (not shown) if the user desires to quit. If the user desires to quit, control is directed to 830 where the OS removes the virtual pointing device from the display. If the user does not desire to quit, at 832, the OS places the virtual pointing device in a "hidden hibernation" state, which means that the image displayed on the touchscreen in the obvious hibernated state (i.e., translucent) begins to fade with time, but can be instantly activated when the user next touches the touchscreen. Therefore, the OS transforms the virtual pointing device from obvious hibernation (e.g., displayed in a translucent form) to hidden hibernation. After a user-specified time (e.g., 30 minutes), the OS interprets the time delay as meaning that the virtual pointing device is no longer needed. At 836, if the OS detects a hand placed on the touchscreen within 30 minutes, at 840, the OS brings the virtual pointing device out of hidden hibernation, redraws it under the hand, and control returns to 324 of FIG. 3. Otherwise, at 838, the OS removes the virtual pointing device currently in a hidden hibernation state from memory (e.g., RAM).
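The life cycle described in FIG. 8 can be summarized as a small state machine: active; obvious (translucent) hibernation for about 5 seconds; hidden (fading) hibernation for a user-specified period (e.g., 30 minutes); then removal from memory. An illustrative sketch only; the state names and time-step interface are assumptions:

```python
ACTIVE = "active"
OBVIOUS = "obvious_hibernation"   # translucent; no functionality invocable
HIDDEN = "hidden_hibernation"     # fading; instantly reactivated on touch
REMOVED = "removed"               # removed from memory; must be re-created

def next_state(state, elapsed, hand_present,
               obvious_timeout=5.0, hidden_timeout=30 * 60.0):
    """Advance the hibernation life cycle one step.

    elapsed is the time in seconds since the hand was lifted.  A hand
    re-positioned on the touchscreen reactivates the device from either
    hibernation state; once removed, the device must be re-created via
    the full identification flow of FIG. 3.
    """
    if hand_present:
        return ACTIVE if state != REMOVED else REMOVED
    if state == ACTIVE:
        return OBVIOUS                      # hand lifted: go translucent
    if state == OBVIOUS and elapsed > obvious_timeout:
        return HIDDEN                       # begin fading
    if state == HIDDEN and elapsed > hidden_timeout:
        return REMOVED                      # no longer needed
    return state
```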

Returning to 312 of FIG. 3, the OS determines if a match has been found between a measured hand placed on the touchscreen and any existing user files. If the OS detects several user files having handprint measurements closely matching the handprint in question, at 316, the OS displays a drop-down menu (not shown) on the touchscreen showing those users having the closest match. At 318, the OS waits for the user to select (using his/her other hand) from the drop-down menu a match in user identity, or a selection indicating that no match has occurred. If a match has occurred, control is directed to 320 (previously described). If no match has occurred, control is directed to 314, where the OS displays on the touchscreen a menu (see 510 in FIG. 5) asking the user to indicate if he/she desires to create a customized virtual pointing device. If the user does not desire to create a customized virtual pointing device, the OS prompts the user to place his/her hand on the touchscreen and, in response, the OS builds a generic virtual pointing device under the user's hand, as shown in FIG. 5, having the default finger/palm behavior and fingerprint functionality as previously described, and control is directed to 324.

If the user does desire to create a customized virtual pointing device, at 332, the OS opens a user file. At 334, the OS stores the size of the fingerprints and palmprint in the user file. At 336, the OS calculates the distance from the first reference point (previously described and shown in FIG. 2) to the farthest point of each fingerprint of the first three fingers. Additionally, the OS could calculate the second reference point and the distance therefrom to the fourth fingerprint. At 338, the OS prompts the user for a user identification and displays main menu 600, which prompts the user to enter virtual pointing device characteristics, such as the virtual pointing device shape, pointer location, behavior and sensitivity, and fingerprint functionality (described herein and shown in FIG. 6). At 340, the OS stores all information in the user file. Control is directed to 322, where the OS draws the virtual pointing device under the hand (or portion of the hand) based on the information stored in the user file.
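The distance calculation at 336 is ordinary Euclidean geometry over touchscreen pixel coordinates. A minimal sketch, assuming each fingerprint's farthest point has already been located as an (x, y) pair:

```python
import math

def hand_measurements(reference_point, fingerprint_extremes):
    """Compute the distances stored in the user file at 336: from the
    first reference point to the farthest point of each of the first
    three fingerprints. Inputs are (x, y) touchscreen pixel
    coordinates; the list layout is an illustrative assumption."""
    return [math.dist(reference_point, p) for p in fingerprint_extremes[:3]]
```

The second reference point and its distance to the fourth fingerprint could be computed the same way.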

At 324, the OS determines if any unusual behavior has occurred. If so, at 328, the OS determines if all fingers of the hand have been lifted off the touchscreen. If so, at 330, the OS displays a main menu 600 as illustrated in FIG. 6, prompting the user to provide any customization of the virtual pointing device.

Referring to FIG. 6, after the OS displays the main menu 600, the user may remove his/her hand from the touchscreen. If the user selects shape button 620, a "shape" menu appears (see 700 in FIG. 7) that allows the user to define/redefine the shape of the virtual pointing device. Referring to shape menu 700 of FIG. 7, the OS displays several options to the user. For example, the user could select a "fingers only" virtual pointing device (see FIG. 4, described herein), whereby only the fingers need to be in contact with the touchscreen to move the virtual pointing device, or a "palm and thumb only" virtual pointing device, whereby only the thumb and palm need to be in contact with the touchscreen to move the virtual pointing device. In the latter case, movement of the fingers would not be assigned functionality. Additionally, a "thumb plus one finger" or "palm" virtual pointing device could be created. However, because the OS invokes the main menu 600 (see FIG. 6) by lifting all fingers while keeping the palm in contact with the touchscreen, if the user defines a new virtual pointing device that does not include the palm, the user could not later re-program the functionality of that special virtual pointing device. Rather, the user would have to start with a generic virtual pointing device to create a new device. Alternatively, a different technique could be used to activate the main menu 600 without departing from the scope of the invention.

The user may change the default accuracy tolerance amount from 10% to one of a number of pre-programmed values. To do so, the user presses accuracy button 702 and, in response, a drop-down list (not shown) of values (e.g., 4%, 8%, 20%) appears for the user's selection. The user enters/saves all selections by pressing button 704. In response, the main menu 600 shown in FIG. 6 reappears.

Returning to FIG. 6, if the user selects define function button 625, a "define function" menu appears that allows the user to define/redefine the functionality of the fingerprint/palmprint areas. Specifically, define functionality menu 730 in FIG. 7 allows the user to change the functionality of each fingerprint and thumbprint area by pressing the associated button next to the appropriate finger. For example, the user has pressed button 732, indicating that he/she desires to change the functionality of the second finger (i.e., fingerprint area 230). In response, the OS displays drop-down list 740 of pre-defined functions stored in memory. The user has selected open function 742 and, in response, the OS displays another drop-down list 746. The user has selected a double click 744 of the second finger to invoke the open function. The user then presses save button 748 to save the entries in the user file. In response, the main menu 600 shown in FIG. 6 reappears. However, one skilled in the art readily recognizes that other changes in finger behavior and fingerprint area functionality may be made without departing from the scope and spirit of this preferred embodiment.
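The save at 748 amounts to recording, per finger, which behavior invokes which function. A minimal sketch of that user-file entry, using the example from the text (a double click of the second finger invoking the open function); the nested-dictionary layout of the user file is an illustrative assumption:

```python
def assign_function(user_file, finger, behavior, function):
    """Record in the user file that `behavior` of `finger` invokes
    `function`, as entered through define functionality menu 730.
    Overwrites any previous assignment for that finger."""
    user_file.setdefault("functions", {})[finger] = (behavior, function)
    return user_file

# The worked example from the description:
profile = assign_function({}, "second", "double_click", "open")
```

At runtime the OS would look up `user_file["functions"][finger]` when the corresponding finger behavior is detected over the fingerprint area.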

Returning to FIG. 6, if the user selects define pointer button 630, a "define pointer" menu appears that allows the user to define/redefine the shape, sensitivity, and position of the pointer on the virtual pointing device. Referring to define pointer menu 760 in FIG. 7, the user has a number of choices regarding the pointer. For example, the user can select a small, medium or large arrow, and/or a blinking arrow. The user can also select small or large pointer sensitivity, and the position of the pointer with respect to the virtual pointing device. For example, the pointer may be positioned over the third finger (default position), over the first finger, or below the palm. However, one skilled in the art readily recognizes that numerous changes in pointer behavior may be made without departing from the scope and spirit of this preferred embodiment. The user presses save button 762 to save the entries and, in response, the main menu 600 appears.

Finally, in FIG. 6, the user has the option of saving and exiting by pressing save/exit button 635, or cancelling all changes and returning to the default virtual pointing device by pressing cancel button 615.

Referring to FIG. 4, in a second embodiment, the OS displays pre-determined, standard size fingerprint areas 420, 430, 435 and 440 and pointer 450 as a non-activated (also referred to as "obviously hibernated") virtual pointing device. The fingerprint areas of the virtual pointing device are translucent such that object icons can be seen through them. To activate the virtual pointing device, the user places one or more fingers over a fingerprint area 420, 430, 435 or 440 on touchscreen 400. Once activated, the OS assigns a default function (e.g., default function displayed above each fingerprint area) to each fingerprint area.

In response to detecting a finger placed over one or more of the fingerprint areas, the OS resizes the fingerprint area(s) to the size of the finger placed on the fingerprint area. Therefore, for example, if a finger is smaller than a fingerprint area, that fingerprint area will be reduced to the size of the finger. Conversely, if the finger is larger than the fingerprint area, the fingerprint area will be enlarged to the size of the finger.

Alternatively, when the OS detects a sound pattern (or heat, pressure, etc.) over one or more of the translucent fingerprint areas 420, 430, 435 and 440, the OS activates only those areas of the virtual pointing device having a finger placed thereon. In this case, the OS assigns a default function (e.g., the default function displayed above each fingerprint area) to each fingerprint area having a finger placed over it. However, the fingerprint areas not having a finger placed over them will not be activated and, as such, will not have the default function assigned to them until they are activated. Each fingerprint area may be activated at any time.
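The per-area activation and resizing described in the two paragraphs above can be sketched together: each touched area is resized to its finger and given its default function, while untouched areas remain inactive. The default-function strings below are illustrative assumptions; the actual defaults are those displayed above each fingerprint area in FIG. 4.

```python
# Illustrative defaults keyed by the reference numerals of FIG. 4:
DEFAULT_FUNCTION = {420: "focus/select", 430: "paste", 435: "open", 440: "menu"}

def activate_areas(touched, finger_sizes):
    """Activate only the fingerprint areas with a finger placed on
    them. Each activated area is resized to the touching finger
    (reduced if the finger is smaller, enlarged if larger) and
    assigned its default function; untouched areas are left out,
    i.e., inactive with no function assigned."""
    return {area: {"size": finger_sizes[area],
                   "function": DEFAULT_FUNCTION[area]}
            for area in touched}
```

Calling this again as more fingers land reflects the note that each fingerprint area may be activated at any time.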

As the user slides his/her fingers over touchscreen 400, the OS detects the touchscreen pixel coordinates under the user's moving fingers and, in response, continuously re-defines the "activated" areas of the virtual pointing device to be the touchscreen areas under the fingers. Therefore, the virtual pointing device moves with and according to the movement of the user's fingers. However, while not all of the fingerprint areas may be activated at once, all fingerprint areas move together as one object.
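The move-as-one-object behavior can be sketched by translating every fingerprint area, activated or not, by the same displacement as the tracked fingers. Tracking the fingers' centroid is a simplifying assumption about how the OS follows the moving pixel coordinates; the patent does not specify the geometry.

```python
def move_device(areas, prev_centroid, finger_positions):
    """Translate all fingerprint areas by the displacement of the
    fingers' centroid, so the whole virtual pointing device moves
    with and according to the user's fingers. Returns the moved
    areas and the new centroid for the next update."""
    cx = sum(x for x, _ in finger_positions) / len(finger_positions)
    cy = sum(y for _, y in finger_positions) / len(finger_positions)
    dx, dy = cx - prev_centroid[0], cy - prev_centroid[1]
    moved = {name: (x + dx, y + dy) for name, (x, y) in areas.items()}
    return moved, (cx, cy)
```

Because every area receives the identical (dx, dy), non-activated areas stay in formation with the activated ones, matching the text.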

The OS positions pointer 450 near the fingerprint area 420 such that pointer 450 moves in accordance with movement of the virtual pointing device. Therefore, the user could, for example, move the virtual pointing device such that pointer 450 is positioned over a desired object icon. Alternatively, the user could merely lift his hand and place it at a desired location, whereby the OS would re-create the virtual pointing device under the user's fingers at the new location. Additionally, any objects or text selected by the virtual pointing device at the time the hand was lifted would also be re-drawn at the new location.

In this example, the user has placed his/her first finger over fingerprint area 420 to activate that area of the virtual pointing device. If the user desires to re-size the distance between the fingerprint areas of the virtual pointing device, the user merely places a separate finger, one by one, over each displayed fingerprint area (thereby activating them) and then slides each finger outward/inward or upward/downward, as appropriate, to customize the distance between the fingerprint areas of the virtual pointing device. In this manner, the user customizes the shape/size of the virtual pointing device to the shape/size of his/her hand. However, the user must actively customize the shape/size of the virtual pointing device each time he/she uses it.

Once the user positions pointer 450 over a desired object icon 422, the user could, for example, single click his first finger over fingerprint area 420 to transfer focus to object icon 422. However, only generic functions (or previously established functions) can be used for this embodiment.

While the invention has been shown and described with reference to a particular embodiment thereof, it will be understood by those skilled in the art that the foregoing and other changes in form and detail may be made therein without departing from the spirit and scope of the invention, which is defined only by the appended claims.

Shieh, Johnny Meng-Han

Patent Priority Assignee Title
4760386, Jun 13 1986 International Business Machines Corporation Automatic hiding and revealing of a pointer during keyboard activity
4914624, May 06 1988 Tyco Electronics Corporation Virtual button for touch screen
5319747, Apr 02 1990 U.S. Philips Corporation Data processing system using gesture-based input data
5327161, Aug 09 1989 3M Innovative Properties Company System and method for emulating a mouse input device with a touchpad input device
5335557, Nov 26 1991 Sandio Technology Corporation Touch sensitive input control device
5376946, Jul 08 1991 CTI ELECTRONICS CORPORATION; ES BETA, INC Computer mouse simulator device
5428367, Jul 08 1991 CTI ELECTRONICS CORPORATION; ES BETA, INC Computer mouse simulator having see-through touchscreen device and external electronic interface therefor
5432531, Dec 14 1990 International Business Machines Corporation Coordinate processor for a computer system having a pointing device
5528263, Jun 15 1994 Daniel M., Platzker Interactive projected video image display system
5539429, Oct 24 1989 Mitsubishi Denki Kabushiki Kaisha Touch device panel
5543591, Jun 08 1992 SYNAPTICS, INC Object position detector with edge motion feature and gesture recognition
5568604, Dec 31 1992 Qwest Communications International Inc Method and system for generating a working window in a computer system
5612719, Dec 03 1992 Apple Inc Gesture sensitive buttons for graphical user interfaces
5617117, Mar 30 1994 Matsushita Electric Industrial Co., Ltd. Input device
5621438, Oct 12 1992 Hitachi, Ltd. Pointing information processing apparatus with pointing function
5790104, Jun 25 1996 International Business Machines Corporation Multiple, moveable, customizable virtual pointing devices
Assignments
Jun 21 1996 — Shieh, Johnny M. to International Business Machines Corporation (assignment of assignors interest; see document for details; reel/frame 009608/0454).
Jun 25 1996 — International Business Machines Corporation (assignment on the face of the patent).
May 20 2005 — International Business Machines Corporation to Lenovo (Singapore) Pte. Ltd. (assignment of assignors interest; see document for details; reel/frame 016891/0507).
Date Maintenance Fee Events
Mar 31 2000 — ASPN: Payor Number Assigned.
Feb 19 2003 — REM: Maintenance Fee Reminder Mailed.
Aug 04 2003 — EXP: Patent Expired for Failure to Pay Maintenance Fees.

