content transfer involving a gesture is described. In an implementation, a method is implemented by a mobile communications device that includes recognizing a gesture input via a touchscreen of the mobile communications device that is indicative of a direction, the touchscreen including a display of content. One or more other mobile communications devices are located that are positioned, approximately, along the indicated direction of the gesture. A communication is formed to transfer the content to the located one or more other mobile communications devices.
1. A method implemented by a mobile communications device, the method comprising:
outputting content to a touchscreen;
recognizing, concurrently with the outputting, a gesture involving a drag or flick via the touchscreen that is indicative of a direction;
determining a direction of the gesture relative to an edge of the touchscreen;
determining if one or more other mobile communications devices are located in the direction relative to the edge of the touchscreen by using one or more directional antennas in the mobile communications devices to determine the direction; and
forming a communication to communicate to the one or more other mobile communications devices that are located in the direction relative to the edge of the touchscreen.
6. A mobile communications device comprising:
one or more processors;
memory, coupled to the one or more processors, the memory comprising instructions executable by the one or more processors to perform:
outputting content to a touchscreen;
recognizing, concurrently with the outputting, a gesture involving a drag or flick via the touchscreen that is indicative of a direction;
determining a direction of the gesture relative to an edge of the touchscreen;
determining if one or more other mobile communications devices are located in the direction relative to the edge of the touchscreen by using one or more directional antennas in the mobile communications devices to determine the direction; and
forming a communication to communicate to the one or more other mobile communications devices that are located in the direction relative to the edge of the touchscreen.
12. A system comprising:
a first mobile communications device configured to:
output content to a touchscreen;
recognize, concurrently with the output of the content, a gesture involving a drag or flick via the touchscreen that is indicative of a direction;
determine a direction of the gesture relative to an edge of the touchscreen;
determine if one or more other mobile communications devices are located in the direction relative to the edge of the touchscreen;
form a communication to communicate to the one or more other mobile communications devices that are located in the direction relative to the edge of the touchscreen;
responsive to forming the communication, transfer the content from the first mobile communications device to one or more of the other mobile communications devices;
the one or more other mobile communications devices configured to:
receive a communication of content from the first mobile communication device, the content communicated by the first mobile communications device responsive to recognition of a gesture that is indicative of a direction relative to an edge of the touchscreen of the first mobile communication device; and
display an indication of the direction from which the communication of the content was received by the one or more other mobile communications devices.
2. The method of
responsive to forming the communication, transferring the content from the first mobile communications device to one or more of the other mobile communications devices.
3. The method of
determining that the one or more other mobile communications devices are authorized to receive the transferred content.
4. The method of
5. The method of
determining that the one or more other mobile communications devices are within a predetermined range.
7. The mobile communications device of
responsive to forming the communication, transferring the content from the first mobile communications device to one or more of the other mobile communications devices.
8. The mobile communications device of
9. The mobile communications device of
determining that the one or more other mobile communications devices are within a predetermined range.
10. The mobile communications device of
determining that the one or more other mobile communications devices are authorized to receive the transferred content.
11. The mobile communications device of
14. The system of
15. The system of
The application claims priority under 35 U.S.C. Section 120 as a continuation of U.S. patent application Ser. No. 12/558,782, filed Sep. 14, 2009, and titled “Content Transfer Involving a Gesture,” the entire disclosure of which is incorporated by reference.
Mobile communication devices (e.g., wireless phones) have become an integral part of everyday life. However, the form factor employed by conventional mobile communications devices is typically limited to promote mobility of the device.
For example, the mobile communications device may have a relatively limited amount of display area when compared to a conventional desktop computer, e.g., a PC. Therefore, conventional techniques used to interact with a desktop computer may be inefficient when employed by a mobile communications device. For example, traditional techniques used to transfer content typically forced a user to navigate through a series of menus to select content to transfer, select a device to receive the content, and then initiate the transfer. Accordingly, these steps may result in user frustration, especially when transferring multiple items of content to different users.
Content transfer involving a gesture is described. In an implementation, a method is implemented by a mobile communications device that includes recognizing a gesture input via a touchscreen of the mobile communications device that is indicative of a direction, the touchscreen including a display of content. One or more other mobile communications devices are located that are positioned, approximately, along the indicated direction of the gesture. A communication is formed to transfer the content to the located one or more other mobile communications devices.
In an implementation, one or more computer-readable storage media include instructions that are executable by a mobile communications device to determine whether one or more other mobile communications devices are within a predetermined range. If so, data is stored that describes a relative position of the one or more other mobile communications devices in relation to the mobile communications device. The relative position is to be used in determining which of the one or more mobile communications devices are to share content from the mobile communications device in response to a gesture received via the mobile communications device.
In an implementation, a mobile communications device includes a touchscreen having a plurality of edges, a processor, and memory configured to maintain content and an operating system. The operating system is executable on the processor to determine that a gesture received via the touchscreen is indicative of a particular one of four edges of the touchscreen, the gesture being received during display of the content on the touchscreen. The operating system is also executable to determine that one or more other mobile communications devices are positioned, approximately, along the particular edge of the touchscreen. The operating system is further executable to form a communication to wirelessly transfer the content to the located one or more other mobile communications devices.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
Overview
Mobile communications devices typically have a small form factor to aid mobility of the mobile communications device. For example, the mobile communications device (e.g., a mobile phone) may be configured with a relatively minimal amount of display area and limited input devices (e.g., a keypad) so that the device may be easily transported. Consequently, traditional techniques used to interact with a conventional computer (e.g., a desktop PC) may be frustrating when used in conjunction with a mobile communications device.
For instance, conventional techniques that were used to transfer content may involve multiple steps that are accessed through a series of menus. Consequently, these conventional techniques may be frustrating to users when incorporated by a mobile communications device, especially when transfer of multiple items of content is desired.
In an implementation, content transfer involving a gesture is described. The gesture received by a mobile communications device is indicative of a direction. Additionally, the gesture may be received during output of content, such as during display of an image on the touchscreen, so that the gesture also indicates what content is to be transferred.
For example, a “flick” or “drag” may be input via a touchscreen of the mobile communications device during a display of content (e.g., an image) that is directed towards an edge of the touchscreen. The mobile communications device may then determine if one or more other mobile communications devices are located along that edge of the touchscreen, e.g., in the approximate direction of the gesture. If so, content that was concurrently output during the gesture may be transferred to the one or more other mobile communications devices. In this way, a single gesture may be used to select content for transfer and indicate where the content is to be transferred without navigating through multiple menus. A variety of other examples are also contemplated, further discussion of which may be found in relation to the following sections.
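As an illustrative sketch of the edge mapping described above, the following Python classifies a drag or flick vector by its dominant axis. The function name and the coordinate convention (y growing downward, as in many touch APIs) are assumptions for illustration, not details taken from the patent.

```python
# Hypothetical sketch: classify a drag/flick vector into the touchscreen
# edge it points toward. Assumes screen coordinates with y growing
# downward, as is common in touch APIs.

def edge_for_gesture(dx, dy):
    """Return 'left', 'right', 'top', or 'bottom' for a drag vector."""
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "bottom" if dy > 0 else "top"
```

A gesture dragged mostly upward, for instance, would resolve to the top edge, matching the scenario in which content is flicked toward a device positioned above the touchscreen.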
In the following discussion, a variety of example implementations of a mobile communications device (e.g., a wireless phone) are described. Additionally, a variety of different functionality that may be employed by the mobile communications device is described for each example, which may be implemented in that example as well as in other described examples. Accordingly, the illustrated example implementations represent but a few of a variety of contemplated implementations. Further, although a mobile communications device having one or more modules that are configured to provide telephonic functionality is described, a variety of other computing devices are also contemplated, such as personal digital assistants, mobile music players, dedicated messaging devices, portable game devices, netbooks, and so on.
Example Implementations
The mobile communications device 102 is further illustrated as including a first housing 104 and a second housing 106 that are connected via a slide 108 such that the first and second housings 104, 106 may move (e.g., slide) in relation to one another. Although sliding is described, it should be readily apparent that a variety of other movement techniques are also contemplated, e.g., a pivot, a hinge and so on.
The first housing 104 includes a display device 110 that may be used to output a variety of content, such as a caller identification (ID), contacts, images (e.g., photos) as illustrated, email, multimedia messages, Internet browsing, game play, music, video, and so on. In an implementation, the display device 110 is configured to function as an input device by incorporating touchscreen functionality, e.g., through capacitive, surface acoustic wave, resistive, optical, strain gauge, dispersive signals, acoustic pulse, and other touchscreen functionality. The touchscreen functionality (as well as other functionality such as track pads) may be used to detect gestures, further discussion of which may be found in relation to the later figures.
The second housing 106 is illustrated as including a keyboard 112 that may also be used to provide inputs to the mobile communications device 102. Although the keyboard 112 is illustrated as a QWERTY keyboard, a variety of other examples are also contemplated, such as a keyboard that follows a traditional telephone keypad layout (e.g., a twelve key numeric pad found on basic telephones), keyboards configured for other languages (e.g., Cyrillic), and so on.
In the “open” configuration as illustrated in the example implementation 100 of
The form factor employed by the mobile communications device 102 may be suitable to support a wide variety of features. For example, the keyboard 112 is illustrated as supporting a QWERTY configuration. This form factor may be convenient to a user to utilize the previously described functionality of the mobile communications device 102, such as to compose texts, play games, check email, “surf” the Internet, provide status messages for a social network, and so on.
The mobile communications device 102 is also illustrated as including a communication module 114. The communication module 114 is representative of functionality of the mobile communications device 102 to communicate via a network 116, such as with another mobile communications device 118. For example, the communication module 114 may include telephone functionality to make and receive telephone calls from the mobile communications device 118. The communication module 114 may also include a variety of other functionality, such as to capture content, form short message service (SMS) text messages, multimedia messaging service (MMS) messages, emails, status updates to be communicated to a social network service, and so on. A variety of other examples are also contemplated, such as blogging, instant messaging, and so on.
The mobile communications device 102 is also illustrated as including a content transfer module 120. The content transfer module 120 is representative of functionality of the mobile communications device 102 to manage a user interface 122 to transfer content 124 to the other mobile communications device 118 via the network 116.
The content transfer module 120 may cause the content to be transferred in a variety of different ways over a variety of networks 116. For example, the content transfer module 120 may communicate directly with the mobile communications device 118 over the network 116, e.g., when configured as a WiFi network or other local wireless network such as Bluetooth. The content transfer module 120 may also communicate the content 124 indirectly over the network 116 when configured as the Internet, such as through a content transfer service 126 implemented using one or more servers 128, further discussion of which may be found in relation to
The mobile communications device 102 is also illustrated as receiving a gesture 210 input by a user's finger 212. The gesture 210 may then be recognized as such by the content transfer module 120 using touchscreen functionality. For example, the gesture 210 may be input via a flick or drag of the user's finger 212 across the display device 110 and thus the content transfer module 120 may recognize this gesture 210 and react accordingly. Additionally, the gesture 210 may indicate a relative direction, which in this example is oriented generally towards the top edge 202 of the display device 110.
In response to the gesture, the content transfer module 120 may determine whether another mobile communications device is positioned along the top edge 202 and/or in the indicated direction of the gesture 210. In the illustrated example, the other mobile communications device 118 is positioned both along the top edge 202 of the mobile communications device 102 and in the indicated direction of the gesture 210. Accordingly, content 124 that is displayed on the display device 110 when the gesture 210 is received is then transferred to the other mobile communications device 118.
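The lookup described above, in which the device checks whether a friend device lies along the indicated edge, can be sketched as follows. The record layout loosely mirrors the RelativePosition structure discussed later in the document; all names are illustrative assumptions.

```python
# Illustrative sketch: find "friend devices" whose recorded relative side
# matches the edge indicated by the gesture.

def devices_along_edge(friend_positions, edge):
    """friend_positions: dict mapping phone number -> recorded relative side.
    Returns the phone numbers of devices along the given edge."""
    return [phone for phone, side in friend_positions.items() if side == edge]
```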
In the illustrated example, the transfer of the content 124 indicates the direction of the transfer by following the user's finger 212 on the display device 110. The content 124 may also indicate a direction from which it was “received” on the other mobile communications device 118, such as by progressively showing the content 124 as being received from the general direction of the mobile communications device 102 using an animation. Thus, the display of the content 124 on the other mobile communications device 118 may indicate a relative location of the mobile communications device 102 that transferred the content 124. A variety of other implementations are also contemplated, such as to display the content 124 on the other mobile communications device 118 when the communication is completed (e.g., the content 124 has completed transfer), and so on.
As previously described, the content 124 may be transferred in a variety of ways. For example, the mobile communications devices 102, 118 may be connected over a local wireless network such that the content 124 is not transferred through an intermediary. In another example, an intermediary (e.g., a web service) is used to transfer the content 124, such as to transfer the content 124 via the Internet. In a further example, an intermediary is used to locate other mobile communications devices (e.g., mobile communications device 118) for the transfer (which may or may not be involved in transfer of the actual content 124), further discussion of which may be found in relation to the following figure.
The system 300 also includes a content transfer service 126 having a transfer manager module 306. The transfer manager module 306 is representative of functionality to aid content transfer between the mobile communications devices 102, 118. For example, the transfer manager module 306 is illustrated as having a position module 308 that is representative of functionality to track the positions of the mobile communications devices 102, 118. The positions may be tracked in a variety of ways, such as through global positioning data that is obtained from the mobile communications devices 102, 118, indications of locations based on transmitters used to communicate with the devices (e.g., through triangulation), and so on. Thus, the locations of the mobile communications devices 102, 118 may be determined through communication with the mobile communications devices 102, 118 themselves and/or other devices, e.g., cell towers for triangulation.
The position information (and other information) stored by the content transfer service 126 may be used to aid the content transfer process. For instance, the content transfer service 126 may record authentication information of the mobile communications devices 102, 118 as well as connection information (e.g., a phone number and type of device). The transfer manager module 306 of the content transfer service 126 may then use this information to recognize a device as an authorized device, and thus a “friend device”.
This information may then be published by the content transfer service 126 to the mobile communications devices 102, 118 at set time intervals. The mobile communications devices 102, 118 may then automatically detect their “friend devices,” such as at predetermined time intervals (e.g., ten seconds), to locate devices within a predetermined range. For example, the range may be set based on the likelihood of content transfer (e.g., two meters) to limit the number of conflicts with other mobile communications devices, suitability for content transfer (e.g., the effective range of a local wireless network), and so on.
For each device that meets these criteria, the mobile communications devices 102, 118 may automatically connect with the found device and record a relative position. The recorded relative position may have the following data structure:
// Relative geo position of a “friend device”
struct RelativePosition
{
    public int PhoneNumber;           // The number of the phone, which is globally unique.
    public RelativeSide relativeSide; // Indicates where the “friend phone” is currently located.
    public double x, y, z;            // Relative location in the 3D world where the “friend phone” is currently located.
}

enum RelativeSide
{
    Left,
    Right,
    Top,
    Bottom
};
Thus, each of the mobile communications devices 102, 118 may record a relative position of other devices that are within range. In an implementation, the devices inform each other when there is a change in position. For example, the mobile communications device 102 may inform the other mobile communications device 118 of a change in coordinates so that the device may update its relative geographic position. This data may also be set to “time out” if updates and/or confirmations of location are not received at predetermined intervals. Therefore, in this example the devices are “aware” of the position of each other before a gesture is received to transfer the content, thereby promoting efficiency in the transfer of the content. A variety of other implementations are also contemplated, such as to locate the devices in response to the gesture. Further discussion of gestures, calculation of relative position, and content transfer may be found in relation to the following procedures.
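The “time out” behavior described above, in which a stored relative position is discarded if updates or confirmations stop arriving, can be sketched as follows. This is an illustrative Python sketch under assumed names (the ten-second interval is the example value from the text), not an implementation from the patent.

```python
import time

# Hypothetical sketch: a store of "friend device" relative positions that
# treats an entry as stale if no update arrives within a predetermined
# interval, mirroring the "time out" behavior described in the text.

class RelativePositionStore:
    def __init__(self, timeout_seconds=10.0, clock=time.monotonic):
        self._timeout = timeout_seconds
        self._clock = clock
        self._entries = {}  # phone number -> (relative side, last update time)

    def update(self, phone, side):
        """Record or refresh a device's relative side."""
        self._entries[phone] = (side, self._clock())

    def side_of(self, phone):
        """Return the stored side, or None once the entry has timed out."""
        entry = self._entries.get(phone)
        if entry is None:
            return None
        side, stamp = entry
        if self._clock() - stamp > self._timeout:
            del self._entries[phone]  # stale: treat the device as gone
            return None
        return side
```

The injectable clock is a testing convenience; an actual device would simply consult its monotonic clock.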
Example Procedures
The following discussion describes content transfer techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to the environment 100 and systems 200-300 of
A relative location of the one or more other mobile communications devices is computed in relation to the mobile communications device (block 404). For example, the content transfer module 120 may obtain the coordinates of the mobile communications device 118 and compare them with the coordinates of the mobile communications device 102 to compute a relative location.
The relative location may be described in a variety of ways. For example, the relative location may be described directionally, such as in a radial manner (e.g., as degrees that follow a basic layout of a compass), generally (e.g., aligned with a top edge, bottom edge, etc.), and so on. In another example, the relative location may be described using relative distances, e.g., close, middle, far. A variety of other examples are also contemplated, such as the storing of actual coordinates which are then used to calculate the relative location at a later time, e.g., after receipt of a gesture.
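The radial description mentioned above can be sketched by computing the bearing from one device to the other and bucketing it into one of the four edges. This assumes both positions share a local planar frame with x growing east and y growing north; the function and bucket names are illustrative assumptions.

```python
import math

# Minimal sketch: describe another device's relative location as one of
# four edges by bucketing the angle between the two positions.

def relative_side(own_xy, other_xy):
    """Return 'left', 'right', 'top', or 'bottom' for the other device,
    given planar (x, y) coordinates for both devices."""
    dx = other_xy[0] - own_xy[0]
    dy = other_xy[1] - own_xy[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360  # 0 = east, counterclockwise
    if 45 <= angle < 135:
        return "top"
    if 135 <= angle < 225:
        return "left"
    if 225 <= angle < 315:
        return "bottom"
    return "right"
```

As the text notes, the raw coordinates could instead be stored and this calculation deferred until a gesture is received.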
The relative location is stored on the mobile communications device (block 406). Thus, in this manner the mobile communications device 102 is “ready” for receipt of a gesture to indicate where to transfer the content and can respond in an efficient manner. Additionally, the relative location may be updated on the mobile communications device (block 408). For example, if the mobile communications device 118 is moved, it may communicate an update to the content transfer module 120 of the mobile communications device 102. In another example, the content transfer service 126 and/or the mobile communications device 102 may determine when the other mobile communications device 118 is moved (e.g., through periodic polling) and generate an update. This stored data may then be used to support a content transfer process, further discussion of which may be found in relation to the following figure.
One or more other mobile communications devices are located that are positioned, approximately, along the indicated direction of the gesture (block 504). The content transfer module 120, for instance, may query data stored by the procedure 400 of
A user interface is output that is configured to receive a selection of one or more of the mobile communications devices if a plurality of other mobile communications devices is positioned, approximately, along the indicated direction of the gesture (block 506). For example, a plurality of devices may be located in a direction. Accordingly, in this implementation a user interface is output that is configured to enable a user to select one or more of the devices to receive the content. A variety of other implementations are also contemplated, such as automatic transfer to each friend device in the approximate direction.
A communication is then formed to transfer the content to the located one or more other mobile communications devices (block 508). The communication may be formed in a variety of ways, such as for transfer over a local wireless network, a wide area network, the Internet, and so on.
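Forming such a communication can be sketched as packaging a content reference and routing details into a message that either transport (direct local wireless, or an intermediary service) could carry. All field names below are assumptions for illustration; the patent does not specify a message format.

```python
# Hypothetical sketch of "forming a communication" to transfer content.

def form_transfer_message(sender_phone, recipient_phones, content_id, direction):
    """Build the transfer message; `direction` lets the receiving device
    animate the content arriving from the sender's side, as described
    elsewhere in the document."""
    return {
        "sender": sender_phone,
        "recipients": list(recipient_phones),
        "content": content_id,
        "direction": direction,
    }
```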
The content is displayed during the transfer to indicate a direction of the transfer (block 510). Referring again to
Example Device
Device 600 includes input 602 that may include Internet Protocol (IP) inputs as well as other input devices, such as the keyboard 112 of
Device 600 also includes one or more processors 606 (e.g., any of microprocessors, controllers, and the like) which process various computer-executable instructions to control the operation of device 600 and to communicate with other electronic devices. Device 600 can be implemented with computer-readable media 608, such as one or more memory components, examples of which include random access memory (RAM) and non-volatile memory (e.g., any one or more of a read-only memory (ROM), flash memory, EPROM, EEPROM, etc.).
Computer-readable media 608 provides data storage to store content and data 610, as well as device applications and any other types of information and/or data related to operational aspects of device 600. For example, an operating system 612 can be maintained as a computer application within the computer-readable media 608 and executed on processor 606. Device applications can also include a communication manager module 614 (which may be used to provide telephonic functionality) and a media manager 616.
Device 600 also includes an audio and/or video output 618 that provides audio and/or video data to an audio rendering and/or display system 620. The audio rendering and/or display system 620 can be implemented as integrated component(s) of the example device 600, and can include any components that process, display, and/or otherwise render audio, video, and image data. Device 600 can also be implemented to provide a user with tactile feedback, such as vibration and haptics.
Generally, the blocks may be representative of modules that are configured to provide represented functionality. Further, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms “module,” “functionality,” and “logic” as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents program code that performs specified tasks when executed on a processor (e.g., CPU or CPUs). The program code can be stored in one or more computer-readable memory devices. The features of the techniques described above are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.
Patent | Priority | Assignee | Title |
10057640, | Aug 17 2015 | GOOGLE LLC | Media content migration based on user location |
11237635, | Apr 26 2017 | Cognixion Corporation | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
11402909, | Apr 26 2017 | Cognixion | Brain computer interface for augmented reality |
11561616, | Apr 26 2017 | Cognixion Corporation | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
11762467, | Apr 26 2017 | Cognixion Corporation | Nonverbal multi-input and feedback devices for user intended computer control and communication of text, graphics and audio |
9110509, | Jul 28 2010 | Vizio Inc. | System, method and apparatus for controlling presentation of content |
9639163, | Jan 30 2014 | Microsoft Technology Licensing, LLC | Content transfer involving a gesture |
Patent | Priority | Assignee | Title |
7532196, | Oct 30 2003 | Microsoft Technology Licensing, LLC | Distributed sensing techniques for mobile devices |
8055296, | Nov 06 2007 | T-MOBILE INNOVATIONS LLC | Head-up display communication system and method |
8380225, | Sep 14 2009 | Microsoft Technology Licensing, LLC | Content transfer involving a gesture |
20050231471
20060164238
20060256074
20070025293
20070146347
20070197229
20080096583
20080129686
20080152263
20080174547
20080252491
20080263460
20090075678
20100013780
20100075605
20100156812
20100178873
20100261496
20100315438
20110045839
20110065459
20110197147
WO2009033217
Executed on | Assignor | Assignee | Conveyance | Frame | Reel | Doc |
Feb 12 2013 | Microsoft Corporation | (assignment on the face of the patent) | / | |||
Oct 14 2014 | Microsoft Corporation | Microsoft Technology Licensing, LLC | ASSIGNMENT OF ASSIGNORS INTEREST SEE DOCUMENT FOR DETAILS | 034544 | /0541 |
Date | Maintenance Fee Events |
Sep 07 2017 | M1551: Payment of Maintenance Fee, 4th Year, Large Entity. |
Jun 23 2021 | M1552: Payment of Maintenance Fee, 8th Year, Large Entity. |