A scroll control recognizes a touch input directed to a selectable item while the selectable item is scrolling on a touch display. The scroll control determines if the selectable item is scrolling above a threshold velocity when the touch input directed to the selectable item is recognized. If the selectable item is determined to be scrolling above the threshold velocity, scrolling of the selectable item is stopped. If the selectable item is determined to be scrolling below the threshold velocity, the selectable item is selected.
1. A method of interpreting user input, the method comprising:
scrolling a selectable item on a touch display;
recognizing a touch input directed to the selectable item while the selectable item is scrolling on the touch display;
determining if the selectable item is scrolling above a threshold velocity when the touch input directed to the selectable item is recognized, the threshold velocity being selected for the selectable item from a plurality of different threshold velocities based on a size of the selectable item to which the touch input is directed, each of the plurality of different threshold velocities corresponding to one or more different selectable items having a different size; and
stopping scrolling of the selectable item if the selectable item is determined to be scrolling above the threshold velocity or selecting the selectable item if the selectable item is determined to be scrolling below the threshold velocity.
9. A computing device, comprising:
a touch display;
a logic subsystem operatively coupled to the touch display; and
a data-holding subsystem holding instructions executable by the logic subsystem to:
scroll a plurality of selectable items on the touch display;
recognize a touch input directed to one of the plurality of selectable items while that selectable item is scrolling on the touch display;
obtain a dimension of that selectable item in a direction of scrolling, the dimension including a pixel size of the selectable item in the direction of scrolling;
apply a multiplier to the obtained dimension for that selectable item to find a threshold velocity specific to that selectable item;
stop scrolling the plurality of selectable items if that selectable item is scrolling above the threshold velocity when the touch input directed to that selectable item is recognized; and
select that selectable item if that selectable item is scrolling below the threshold velocity when the touch input directed to that selectable item is recognized.
19. A method of interpreting user input, the method comprising:
scrolling a large selectable item and a small selectable item on a touch display, the large selectable item having a larger size than the small selectable item;
recognizing a first touch input directed to the large selectable item while the large selectable item is scrolling on the touch display;
stopping scrolling of the large selectable item if the large selectable item is determined to be scrolling above a first threshold velocity that is based on the larger size or selecting the large selectable item if the large selectable item is scrolling below the first threshold velocity;
recognizing a second touch input directed to the small selectable item while the small selectable item is scrolling on the touch display;
stopping scrolling of the small selectable item if the small selectable item is determined to be scrolling above a second threshold velocity that is based on a size of the small selectable item or selecting the small selectable item if the small selectable item is scrolling below the second threshold velocity.
2. The method of
3. The method of
4. The method of
5. The method of
6. The method of
7. The method of
8. The method of
10. The computing device of
11. The computing device of
12. The computing device of
13. The computing device of
14. The computing device of
15. The computing device of
16. The computing device of
17. The computing device of
20. The method of
A touch display is a display that serves the dual function of visually presenting information and receiving user input. Touch displays may be utilized with a variety of different devices to provide a user with an intuitive input mechanism that can be directly linked to information visually presented by the touch display. A user may use touch input to push soft buttons, turn soft dials, size objects, orient objects, or perform a variety of other inputs.
A scroll control recognizes a touch input directed to a selectable item while the selectable item is scrolling on a touch display. The scroll control determines if the selectable item is scrolling above a threshold velocity when the touch input directed to the selectable item is recognized. If the selectable item is determined to be scrolling above the threshold velocity, scrolling of the selectable item is stopped. If the selectable item is determined to be scrolling below the threshold velocity, the selectable item is selected.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
Computing device 10 is shown visually presenting an application-launching user interface 13 that includes a plurality of icons that correspond to different applications that the computing device is configured to run. Application-launching user interface 13 is shown displaying a shopping cart icon 14, a camera icon 15, and a musical note icon 16. Such icons may respectively correspond to a shopping application, a photo-organizing application, and a music-organizing application. The icons are selectable items which may be selected by touch input from the user. Furthermore, the icons may be scrolled across touch display 11, so that other icons may be brought into view. For example, the icons may be scrolled across touch display 11 with a scrolling inertia in response to a swiping touch input.
While described here in the context of an application-launching user interface visually presenting icons, it is to be understood that a touch display may visually present one or more other types of selectable items. The present disclosure is compatible with all such selectable items. Nonlimiting examples of such selectable items include words in a list, points on a map, and photos in an array, among others.
Turning to
At 22, method 20 includes recognizing a touch input directed to the selectable item while the selectable item is scrolling on the touch display. The touch input may be recognized in a variety of different ways depending on the type of touch display on which the selectable item is being displayed. As an example, the selectable item may be presented on a capacitive touch screen, in which case recognizing the touch input directed to the selectable item may include recognizing a change in capacitance at or near a portion of the screen displaying the selectable item. As another example, the selectable item may be presented on a surface computing device that uses infrared light to track user input, in which case recognizing the touch input directed to the selectable item may include recognizing a change in an amount of infrared light reflecting from a portion of the surface displaying the selectable item. Other touch computing systems may recognize touch input in a different manner without departing from the scope of this disclosure.
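The display-type-dependent recognition described above might be sketched as follows. This is an illustrative sketch only; the display-type strings, sensor readings, and threshold constants are hypothetical assumptions and not part of the disclosure:

```python
# Illustrative sketch: recognizing a touch input directed to a selectable
# item on different display types. The reading is a sensed value at the
# portion of the display presenting the item; thresholds are placeholders.

CAPACITANCE_DELTA_THRESHOLD = 0.05  # assumed change indicating a touch
IR_REFLECTANCE_THRESHOLD = 0.30     # assumed reflected-light fraction

def touch_recognized(display_type: str, reading: float) -> bool:
    """Return True if the reading at the item's screen region indicates a touch."""
    if display_type == "capacitive":
        # A change in capacitance at or near the portion of the screen
        # displaying the selectable item.
        return reading > CAPACITANCE_DELTA_THRESHOLD
    if display_type == "infrared_surface":
        # A change in the amount of infrared light reflecting from the
        # portion of the surface displaying the selectable item.
        return reading > IR_REFLECTANCE_THRESHOLD
    raise ValueError(f"unsupported display type: {display_type}")
```

Other touch computing systems would recognize touch input in a different manner, as the passage above notes.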
At 23, method 20 includes determining if the selectable item is scrolling above a threshold velocity when the touch input directed to the selectable item is recognized. The threshold velocity may be selected based on a plurality of different considerations. As an example, user interaction with a touch display may be examined, and it can be determined at what scrolling speed a user feels comfortable selecting a scrolling item on a particular device. It is thought that if an item is scrolling too quickly, a user may not wish to select the scrolling item, but instead stop its scrolling. On the other hand, if a selectable item is scrolling with a relatively slow velocity, a user may expect the selectable item to be selected instead of merely stopping scrolling of the item.
The threshold velocity may be selected to be a certain number of pixels per second (e.g., 1, 2, 5, 10, or another number of pixels per second). The number of pixels per second may be modified depending on the size of the display, the size of the selectable item, the context in which the selectable item is presented, or other suitable factors.
The threshold velocity may be selected to correspond to the selectable item moving a distance equal to a proportional dimension of the selectable item in a direction of scrolling. For example, if a selectable item is scrolling horizontally, and the selectable item is 100 pixels wide in the horizontal dimension, the threshold velocity may be set at some percentage of 100 pixels per second. As specific nonlimiting examples, the threshold velocity may be set to equal the horizontal dimension (e.g., 100 pixels per second), to be twice as large as the horizontal dimension (e.g., 200 pixels per second), to be one half as large as the horizontal dimension (e.g., 50 pixels per second), to be one quarter as large as the horizontal dimension (e.g., 25 pixels per second), or to be any other ratio that yields a threshold velocity below which a user expects a scrolling selectable item to be selected as opposed to merely stopping responsive to user input directed to the selectable item.
In other embodiments, other metrics may be used in determining the threshold velocity, such as the total target area of the selectable item, the spacing between adjacent selectable items, and a user's level of proficiency with the user interface, among others.
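The dimension-and-multiplier approach described above can be sketched as a simple computation; the function name is illustrative, but the multiplier ratios come directly from the examples in the disclosure:

```python
def threshold_velocity(item_extent_px: int, multiplier: float = 1.0) -> float:
    """Threshold velocity, in pixels per second, for one selectable item.

    item_extent_px: the item's dimension, in pixels, in the direction
        of scrolling (e.g., width for horizontal scrolling).
    multiplier: the ratio applied to that dimension (e.g., 2.0, 0.5,
        or 0.25, per the examples in the disclosure).
    """
    return item_extent_px * multiplier

# A selectable item 100 pixels wide, scrolling horizontally, with a
# one-half multiplier:
# threshold_velocity(100, 0.5) -> 50.0 pixels per second
```

Because the threshold is derived from each item's own extent, larger items naturally receive larger thresholds, matching the size-based selection of thresholds recited in the claims.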
As shown at 24, method 20 includes stopping scrolling of the selectable item if the selectable item is determined to be scrolling above the threshold velocity.
Turning back to
When a selectable item is selected, an alternative view of the selectable item may be displayed. This is somewhat schematically shown at times t2 and t3 of
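The stop-or-select decision described in the preceding steps can be sketched as follows. The function name is illustrative; behavior exactly at the threshold is not specified by the disclosure, so this sketch treats it as a stop:

```python
def handle_touch_on_scrolling_item(scroll_speed_px_s: float,
                                   threshold_px_s: float) -> str:
    """Interpret a touch input directed to a scrolling selectable item.

    Above the threshold velocity, scrolling is stopped; below it, the
    item is selected. (Equality is unspecified in the disclosure; this
    sketch groups it with the stop branch.)
    """
    if scroll_speed_px_s >= threshold_px_s:
        return "stop_scrolling"
    return "select_item"
```

For example, with a 100-pixel-per-second threshold, a touch on an item scrolling at 200 pixels per second stops the scrolling, while a touch on an item scrolling at 50 pixels per second selects it.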
In some embodiments, the above described methods and processes may be tied to a computing system. As an example,
Logic subsystem 51 may include one or more physical devices configured to execute one or more instructions. For example, the logic subsystem may be configured to execute one or more instructions that are part of one or more programs, routines, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more devices, or otherwise arrive at a desired result. The logic subsystem may include one or more processors that are configured to execute software instructions. Additionally or alternatively, the logic subsystem may include one or more hardware or firmware logic machines configured to execute hardware or firmware instructions. The logic subsystem may optionally include individual components that are distributed throughout two or more devices, which may be remotely located in some embodiments.
Data-holding subsystem 52 may include one or more physical devices configured to hold data and/or instructions executable by the logic subsystem to implement the herein described methods and processes. When such methods and processes are implemented, the state of data-holding subsystem 52 may be transformed (e.g., to hold different data). Data-holding subsystem 52 may include removable media and/or built-in devices. Data-holding subsystem 52 may include optical memory devices, semiconductor memory devices, and/or magnetic memory devices, among others. Data-holding subsystem 52 may include devices with one or more of the following characteristics: volatile, nonvolatile, dynamic, static, read/write, read-only, random access, sequential access, location addressable, file addressable, and content addressable. In some embodiments, logic subsystem 51 and data-holding subsystem 52 may be integrated into one or more common devices, such as an application specific integrated circuit or a system on a chip.
Touch display 53 may be used to present a visual representation of data held by data-holding subsystem 52. As the herein described methods and processes change the data held by the data-holding subsystem, and thus transform the state of the data-holding subsystem, the state of touch display 53 may likewise be transformed to visually represent changes in the underlying data. Touch display 53 may be combined with logic subsystem 51 and/or data-holding subsystem 52 in a shared enclosure, or touch display 53 may be a peripheral display device.
Continuing with
To sense objects that are contacting or near to display surface 64, surface computing system 60 may include one or more image capture devices (e.g., sensor 78, sensor 80, sensor 82, sensor 84, and sensor 86) configured to capture an image of the backside of display surface 64, and to provide the image to logic subsystem 74. The diffuser screen layer 72 can serve to reduce or avoid the imaging of objects that are not in contact with or positioned within a few millimeters or other suitable distance of display surface 64, and therefore helps to ensure that at least objects that are touching transparent portion 70 of display surface 64 are detected by the image capture devices.
These image capture devices may include any suitable image sensing mechanism. Examples of suitable image sensing mechanisms include but are not limited to CCD and CMOS image sensors. Further, the image sensing mechanisms may capture images of display surface 64 at a sufficient frequency to detect motion of an object across display surface 64. Display surface 64 may alternatively or further include an optional capacitive, resistive or other electromagnetic touch-sensing mechanism, which may communicate touch input to the logic subsystem via a wired or wireless connection 88.
The image capture devices may be configured to detect reflected or emitted energy of any suitable wavelength, including but not limited to infrared and visible wavelengths. To assist in detecting objects placed on display surface 64, the image capture devices may further include an additional light source, such as one or more light emitting diodes (LEDs).
In some examples, one or both of infrared light source 90 and infrared light source 92 may be positioned at any suitable location within surface computing system 60. In the example of
It is to be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated may be performed in the sequence illustrated, in other sequences, in parallel, or in some cases omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and nonobvious combinations and subcombinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Dec 03 2008 | HOOVER, PAUL ARMISTEAD | Microsoft Corporation | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 023066 | /0670 | |
Dec 05 2008 | Microsoft Corporation | (assignment on the face of the patent) | / | |||
Oct 14 2014 | Microsoft Corporation | Microsoft Technology Licensing, LLC | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 034564 | /0001 |