A system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device by creating contextual tags and associating, or tagging, content with the contextual tags. A contextual tag includes a contextual behavior that is either satisfied or not based on the current context of the mobile device. During operation, content accessible to the mobile device is searched to determine which contextual tags are satisfied based on the current context of the mobile device. Content tagged with contextual tags whose behaviors are currently satisfied is filtered and presented to the user. This allows the automatic presentation of a more manageable subgroup of content to the user on the mobile device based on the current context of the mobile device.
|
1. A method of filtering content, comprising the steps of:
determining a current context of a mobile device;
determining if a behavior of a contextual tag associated with content accessible to the mobile device is satisfied by the current context; and
determining if the content is to be presented on the mobile device based on the determining if the behavior is satisfied.
0. 91. A method of filtering content, comprising the steps of:
determining a current context of a mobile device;
determining if multiple behaviors of a composite contextual tag associated with content accessible to the mobile device are satisfied by the current context; and
determining if the content is to be presented on the mobile device based on the determining if the multiple behaviors are satisfied.
17. A mobile device, comprising:
a control system, comprising a microprocessor that operates to:
determine a current context of the mobile device;
determine if a behavior of a contextual tag associated with content accessible to the mobile device is satisfied by the current context; and
determine if the content is to be presented on the mobile device based on the determining if the behavior is satisfied.
24. A non-transitory computer readable medium embodied in an article of manufacture and storing software adapted to execute on a microprocessor to:
determine a current context of a mobile device;
determine if a behavior of a contextual tag associated with content accessible to the mobile device is satisfied by the current context; and
determine if the content is to be presented on the mobile device based on the determining if the behavior is satisfied.
0. 69. A mobile device, comprising:
a control system having at least one microprocessor, wherein the control system operates to:
determine a current context of the mobile device;
determine if multiple behaviors of a composite contextual tag associated with content accessible to the mobile device are satisfied by the current context; and
determine if the content is to be presented on the mobile device based on the determining if the multiple behaviors are satisfied.
0. 76. A computer program product, the computer program product stored on a non-transitory computer-readable storage medium and including instructions configured to cause a microprocessor to carry out the steps of:
determining a current context of a mobile device;
determining if a behavior of a contextual tag associated with content accessible to the mobile device is satisfied by the current context; and
determining if the content is to be presented on the mobile device based on the determining if the behavior is satisfied.
0. 86. A computer program product, the computer program product stored on a non-transitory computer-readable storage medium and including instructions configured to cause a microprocessor to carry out the steps of:
determining a current context of a mobile device;
determining if multiple behaviors of a composite contextual tag associated with content accessible to the mobile device are satisfied by the current context; and
determining if the content is to be presented on the mobile device based on the determining if the multiple behaviors are satisfied.
2. The method of
3. The method of
4. The method of
wherein the determining if the behavior is satisfied comprises determining if multiple behaviors of the composite contextual tag associated with content accessible to the mobile device are satisfied by the current context; and
wherein the determining if the content is to be presented on the mobile device comprises determining if the content is to be presented on the mobile device based on the determining if the multiple behaviors are satisfied.
5. The method of
6. The method of
determining if a static condition of a simple tag associated with content accessible to the mobile device is satisfied; and
based on the determining if the static condition is satisfied, determining if the content is to be presented on the mobile device.
7. The method of
8. The method of
9. The method of
10. The method of
11. The method of
wherein the determining if the behavior is satisfied by the current context comprises determining if the behavior of the contextual tag associated with the content accessible to the mobile device is satisfied by the current location based on an initial location of the mobile device when the contextual tag was implicitly created.
12. The method of
wherein the determining if the behavior is satisfied comprises determining if the behavior of the contextual tag associated with the content accessible to the mobile device is satisfied by the override context.
13. The method of
receiving a request to review one or more initial contexts from a plurality of the contextual tags associated with the content accessible by the mobile device;
presenting the one or more initial contexts on the mobile device;
receiving a selection for the one or more initial contexts on the mobile device; and
presenting content on the mobile device tagged with initial contexts that match the selected one or more initial contexts.
14. The method of
15. The method of
16. The method of
18. The mobile device of
19. The mobile device of
20. The mobile device of
21. The mobile device of
22. The mobile device of
23. The mobile device of
wherein the control system determines if the behavior is satisfied by determining if the behavior of the contextual tag associated with the content accessible to the mobile device is satisfied by the override context.
25. The non-transitory computer readable medium of
0. 26. The mobile device of claim 20, wherein the at least one context sensor is a combination of two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 27. The mobile device of claim 20, wherein the at least one context sensor is a plurality of context sensors comprising two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 28. The mobile device of claim 20, wherein the at least one context sensor is at least one light meter.
0. 29. The mobile device of claim 20, wherein the at least one context sensor is at least one microphone.
0. 30. The mobile device of claim 20, wherein the at least one context sensor is at least one WiFi access point.
0. 31. The mobile device of claim 20, wherein the at least one context sensor is at least one 3G receiver.
0. 32. The mobile device of claim 20, wherein the at least one context sensor is at least one clock.
0. 33. The mobile device of claim 17, wherein the control system is adapted to determine the current context of the mobile device by receiving information from at least two context sensors associated with the mobile device, and wherein the at least two context sensors include a GPS sensor and a WiFi access point.
0. 34. The mobile device of claim 17, wherein the control system is adapted to determine the current context of the mobile device using near field communications.
0. 35. The mobile device of claim 17, wherein the behavior is location.
0. 36. The mobile device of claim 17, wherein the contextual tag is associated with the content persistently stored in the mobile device.
0. 37. The mobile device of claim 17, wherein the content is received from a remote source.
0. 38. The mobile device of claim 17, wherein the content is streaming content received from a remote source.
0. 39. The mobile device of claim 17, wherein the current context is based on sensing at least one of the group consisting of: internal conditions of the mobile device, environmental conditions of the mobile device, and the noise level of the environmental conditions through a microphone.
0. 40. The mobile device of claim 17, wherein the control system further operates to:
repeatedly scan the content to associate the content with contextual tags.
0. 41. The mobile device of claim 17, wherein the control system further operates to:
automatically and dynamically contextually filter content based on the current context of the mobile device.
0. 42. The mobile device of claim 17, further comprising:
a touch screen for inputting information into the mobile device.
0. 43. The mobile device of claim 17, wherein the contextual tag is assigned with built-in behaviors.
0. 44. The mobile device of claim 17, wherein the contextual tag is assigned with created behaviors.
0. 45. The mobile device of claim 17, wherein the control system further operates to:
implicitly tag content when it is being played back on the mobile device.
0. 46. The mobile device of claim 17, wherein the control system further operates to:
store the contextual tag and its associated behavior in a tag table.
0. 47. The mobile device of claim 17, wherein the control system further operates to determine if the behavior is satisfied by comparing an initial context to the current context.
0. 48. The mobile device of claim 47, wherein the initial context is sensed with at least one context sensor.
0. 49. The mobile device of claim 17, wherein the control system operates to determine if the behavior of the contextual tag associated with the content accessible to the mobile device is satisfied by the current context based on an initial context of the contextual tag.
0. 50. The mobile device of claim 49, wherein the contextual tag is a location-based contextual tag, and further wherein the initial context is an initial location of the mobile device when the contextual tag was created and the current context is a current location of the mobile device.
0. 51. The mobile device of claim 23, wherein the override context includes contextual data that was used in one of the group consisting of: the past and the recent past.
0. 52. The mobile device of claim 17, wherein the mobile device is a wireless device.
0. 53. The mobile device of claim 17, wherein the control system is adapted to perform multimedia functionality selected from the group consisting of: searching, organizing, browsing, previewing, rendering, sharing, transferring, and a combination of any of these multimedia functions.
0. 54. The mobile device of claim 17, wherein the control system is adapted to receive the content from a wired network.
0. 55. The mobile device of claim 17, wherein the control system is adapted to receive the content from a wireless network.
0. 56. The mobile device of claim 17, wherein the control system is further adapted to create contextual tags when there is no network connection.
0. 57. The mobile device of claim 17, wherein the contextual tag is a composite contextual tag, and further wherein the control system:
determines if the behavior is satisfied by determining if multiple behaviors of the composite contextual tag associated with content accessible to the mobile device are satisfied by the current context; and
determines if the content is to be presented on the mobile device by determining if the content is to be presented on the mobile device based on the determining if the multiple behaviors are satisfied.
0. 58. The mobile device of claim 17, wherein the control system is further adapted to:
for content that has no associated contextual tag, present the content.
0. 59. The mobile device of claim 17, wherein the behavior is a time of day.
0. 60. The mobile device of claim 17, wherein the behavior is a day of the week.
0. 61. The mobile device of claim 17, wherein the behavior is a date.
0. 62. The mobile device of claim 17, wherein the behavior is velocity.
0. 63. The mobile device of claim 17, wherein the behavior is acceleration.
0. 64. The mobile device of claim 17, wherein the behavior is direction of travel.
0. 65. The mobile device of claim 17, wherein the behavior is weather.
0. 66. The mobile device of claim 17, wherein the behavior is an amount of sunlight.
0. 67. The mobile device of claim 17, wherein the behavior is proximity to other users.
0. 68. The mobile device of claim 17, wherein the behavior is states of applications running on the mobile device.
0. 70. The mobile device of claim 69, wherein the control system is adapted to determine the current context of the mobile device by receiving information from a plurality of context sensors associated with the mobile device.
0. 71. The mobile device of claim 70, wherein the plurality of context sensors comprise two or more of a group of sensors consisting of: a GPS sensor, an accelerometer, a WiFi access point, a microphone, a light meter, and a clock.
0. 72. The mobile device of claim 70, wherein the plurality of context sensors comprise a context sensor that is a combination of two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 73. The mobile device of claim 69, wherein the multiple behaviors comprise two or more of the group consisting of: location, time of day, day of the week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity to other users, and states of applications running on the mobile device.
0. 74. The mobile device of claim 69, wherein the control system operates to determine if the multiple behaviors of the composite contextual tag associated with the content accessible to the mobile device are satisfied by the current context based on an initial context of the composite contextual tag, and wherein the composite contextual tag is a location-based contextual tag, the initial context is an initial location of the mobile device when the composite contextual tag was created, and the current context is a current location of the mobile device.
0. 75. The mobile device of claim 69, wherein the composite contextual tag is a location-based contextual tag and the current context is a current location of the mobile device.
0. 77. The computer program product of claim 76, wherein the instructions are configured to cause the microprocessor to further carry out the step of determining the current context of the mobile device by receiving information from at least one context sensor associated with the mobile device.
0. 78. The computer program product of claim 77, wherein the at least one context sensor comprises at least one GPS sensor.
0. 79. The computer program product of claim 77, wherein the at least one context sensor comprises a context sensor that is a combination of two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 80. The computer program product of claim 77, wherein the at least one context sensor is at least one light meter.
0. 81. The computer program product of claim 77, wherein the at least one context sensor is at least one microphone.
0. 82. The computer program product of claim 77, wherein the at least one context sensor is at least one WiFi access point.
0. 83. The computer program product of claim 77, wherein the at least one context sensor is at least one 3G receiver.
0. 84. The computer program product of claim 77, wherein the at least one context sensor is at least one clock.
0. 85. The computer program product of claim 76, wherein the behavior comprises one or more of a group consisting of: location, time of day, day of the week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity to other users, and states of applications running on the mobile device.
0. 87. The computer program product of claim 86, wherein the instructions are configured to cause the microprocessor to further carry out the step of determining the current context of the mobile device by receiving information from a plurality of context sensors associated with the mobile device.
0. 88. The computer program product of claim 87, wherein the plurality of context sensors comprise two or more of the group of sensors consisting of: a GPS sensor, an accelerometer, a WiFi access point, a microphone, a light meter, and a clock.
0. 89. The computer program product of claim 87, wherein the plurality of context sensors comprise a context sensor that is a combination of two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 90. The computer program product of claim 86, wherein the multiple behaviors comprise two or more of the group consisting of: location, time of day, day of the week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity to other users, and states of applications running on the mobile device.
0. 92. The method of claim 91, wherein the determining the current context of the mobile device comprises receiving information from a plurality of context sensors associated with the mobile device.
0. 93. The method of claim 92, wherein the plurality of context sensors comprise two or more of the group of sensors consisting of: a GPS sensor, an accelerometer, a WiFi access point, a microphone, a light meter, and a clock.
0. 94. The method of claim 92, wherein the plurality of context sensors comprise a context sensor that is a combination of two or more of the group of sensors consisting of: a GPS sensor, a light meter, a microphone, a WiFi access point, a 3G receiver, and a clock.
0. 95. The mobile device of claim 75, wherein the multiple behaviors comprise two or more of the group consisting of: location, time of day, day of the week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity to other users, and states of applications running on the mobile device.
0. 96. The mobile device of claim 75, wherein at least one behavior of the composite contextual tag is location-based.
|
The present invention relates to a system and method of filtering content, including but not limited to multimedia content, on a mobile device based on contextual tagging. Content is filtered based on whether a current context of a “contextually aware” mobile device satisfies the contextual behavior defined in a contextual tag associated with the content.
The development of small form factor, large memory capacity hard drives and other memory devices has facilitated growth of mobile devices for accessing and playing digital media. Mobile devices are particularly useful because they facilitate convenient “on-the-go” access of digital media for their users. Media content is stored in local memory in the mobile device for access by the user when desired. An example of such a mobile device is the Apple® iPod® media player, which provides gigabytes of memory storage. Media software applications, such as Apple® iTunes® for example, are executed on a user's computer to store and manage the user's media library and to facilitate downloading of desired media content to local memory in mobile devices.
Given the plethora of media content available, users may not have all desired media content stored on their mobile devices. Thus, many mobile devices are increasingly being equipped with wireless communication capabilities. Wireless communications allow mobile devices to access media content not stored locally on the device. Short-range wireless communication allows users to share media content with other users. Many manufacturers are also adding cellular communication capabilities to mobile devices so that media players can access media content over cellular networks from remote service providers. An example of such a mobile device is the Apple® iPhone®, which combines a cellular phone and a media player in one mobile device.
Because of the plethora of media content available to users of mobile devices, both from locally stored and remotely accessed content, it is increasingly important to provide filtering capabilities. Without filtering, users may have to navigate through large and unmanageable media file listings to find desired media content. Filtering capabilities allow content to be provided to users in more manageable subgroups. To provide filtering, media content can be tagged with one or more static criteria that delineate the content in some manner. For example, if the media content comprises audio files, the audio files may include a genre tag. If an audio file is of a “Comedy” genre, the media item may be tagged with a “Comedy” genre tag. Thus, if the user of the mobile device only wants to access audio files in the “Comedy” genre, the mobile device can consult the genre tag of the audio files to provide only those files having a “Comedy” genre tag.
One disadvantage of such filtering systems is that they use static criteria and are thus non-intelligent. The filtering criterion provided by the tag does not adapt to changes in the environment or context of the mobile device. For example, some media items tagged with a “Comedy” genre tag may be appropriate for some contexts, such as home, but not for others, such as a work place. Other media items may also be tagged with the “Comedy” genre tag but be appropriate for either home or work use. In such systems, all media items tagged with “Comedy” genre tags are filtered equally. Thus, the user may not be able to filter effectively based on the presence of the “Comedy” genre tag, because this filter may include media items that are both appropriate and inappropriate for a particular environment or context of the mobile device. If the mobile device could determine which “Comedy” media items were appropriate for which contexts on an individualized basis, the user could use the “Comedy” genre filter without fear of a contextually inappropriate selection being made.
The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context, and an optional initial context. In this manner, the user controls the context that must exist for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts, but inappropriate for others. The mobile device is equipped to be “context aware.” The mobile device may use a sensed context to define the initial context of a contextual tag when it is created, as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding that can be sensed or determined by the mobile device, including the user's interaction with the mobile device.
During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items are tagged with contextual tags whose behaviors are satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered out and not presented to the user. In this manner, the present invention allows the user to manage, and automatically be presented with, a more manageable subgroup of content on the mobile device based on the context of the mobile device. This is opposed to filtering content solely on static criteria that do not adapt or change based on the context of the mobile device.
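The filtering pass described above can be sketched in a few lines. This is a minimal illustration, not the patent's implementation; the names (`Content`, `ContextualTag`, `filter_content`) and the modeling of a context as an attribute dictionary are assumptions made for the sketch. It also reflects the default of presenting content that has no associated contextual tag.

```python
# Illustrative sketch only: names and the dict-based "context" model
# are assumptions, not taken from the patent.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# A "context" is modeled as a simple attribute dictionary,
# e.g. {"location": "work", "hour": 14}.
Context = Dict[str, object]

@dataclass
class ContextualTag:
    name: str
    # The contextual behavior is a predicate over the current context.
    behavior: Callable[[Context], bool]

@dataclass
class Content:
    title: str
    tags: List[ContextualTag] = field(default_factory=list)

def filter_content(items: List[Content], current: Context) -> List[Content]:
    """Return only items whose contextual tags are all satisfied by the
    current context; untagged items are presented by default."""
    visible = []
    for item in items:
        if not item.tags or all(tag.behavior(current) for tag in item.tags):
            visible.append(item)
    return visible
```

With a hypothetical "work" tag, content tagged for work is hidden whenever the current context places the device elsewhere, while untagged content is always shown.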
For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location as the initial context. How close is decided by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e. the current context) indicates that the mobile device is located within ten miles of the work place in this example (i.e. the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
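A location-based behavior like the ten-mile example might be realized by comparing the initial location (captured when the tag was created) against the current sensed location. The sketch below assumes latitude/longitude coordinates and a haversine great-circle distance; the function names and the closure-based design are illustrative, not from the patent.

```python
# Illustrative sketch: a location-based contextual behavior that is
# satisfied when the device is within a radius of the initial context.
import math

def haversine_miles(lat1: float, lon1: float,
                    lat2: float, lon2: float) -> float:
    """Great-circle distance between two lat/lon points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def make_location_behavior(initial_lat: float, initial_lon: float,
                           radius_miles: float = 10.0):
    """Build a behavior closure: satisfied when the current location
    (the current context) is within radius_miles of the initial
    location (the initial context)."""
    def behavior(current_lat: float, current_lon: float) -> bool:
        return haversine_miles(initial_lat, initial_lon,
                               current_lat, current_lon) <= radius_miles
    return behavior
```

A tag created at the work place would carry `make_location_behavior(work_lat, work_lon)`; the behavior then evaluates to true only while the device's sensed coordinates fall inside the ten-mile radius.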
The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device's external environment or surroundings, one or more context sensors or other hardware components associated with the mobile device may be used to determine the current context of the mobile device. In this manner, the mobile device is “context aware.” Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for “home” and “work” behaviors. Content assigned with a contextual tag associated with “home” behavior may not be appropriate for a “work” context, and vice versa.
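A composite contextual behavior can be sketched as a conjunction of component behaviors: the composite is satisfied only when every component is satisfied by the current context. The component behaviors below (`at_work`, `daytime`) and the dict-based context are illustrative assumptions.

```python
# Illustrative sketch of a composite contextual behavior.
from typing import Callable, Dict, List

# A "context" is modeled as a simple attribute dictionary.
Context = Dict[str, object]

def make_composite_behavior(
        behaviors: List[Callable[[Context], bool]]
) -> Callable[[Context], bool]:
    """Composite behavior: satisfied only when every component
    behavior is satisfied by the current context."""
    def composite(current: Context) -> bool:
        return all(behavior(current) for behavior in behaviors)
    return composite

# Two hypothetical component behaviors.
def at_work(ctx: Context) -> bool:
    return ctx.get("location") == "work"

def daytime(ctx: Context) -> bool:
    hour = ctx.get("hour", 0)
    return 9 <= hour < 17

work_hours = make_composite_behavior([at_work, daytime])
```

Under this design, a composite "work hours" tag hides its content outside 9-to-5 even when the device is at the work location.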
The contextual tags may be established in data structures stored in association with the mobile device. These data structures may be implemented using object-oriented design (OOD) principles. OOD may be particularly well suited since it defines methods and attributes so as to associate behavior with data. For example, when a user desires to create a contextual tag, a tag factory object may be called upon to create a contextual tag object from a tag class. The tag factory may also be called upon to allow the user to create and associate one or more behavior objects with contextual tag objects. A contextual tag object does not contain any behavior evaluations. Instead, the one or more behavior objects associated with a contextual tag object are called upon. The behavior evaluations in the behavior objects are separated from the contextual tag objects to support decoupling, thus allowing easier reuse of behavior objects by other contextual tag objects. If the one or more contextual behavior objects associated with a contextual tag object are satisfied by the current context according to rules and state attributes in the behavior objects, the content tagged with the contextual tag will be made accessible by the mobile device to the user.
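One possible object-oriented rendering of the factory/tag/behavior decoupling described above is sketched below. The class and method names are illustrative, not taken from the patent; the key property shown is that the tag object holds no behavior evaluation itself and that a behavior object can be shared by several tags.

```python
# Illustrative OOD sketch: tags delegate evaluation to reusable
# Behavior objects created and wired up by a factory.
from typing import Callable, Dict, List

Context = Dict[str, object]

class Behavior:
    """Holds the rules/state for one contextual behavior, decoupled
    from any tag so it can be reused by other contextual tags."""
    def __init__(self, rule: Callable[[Context], bool]):
        self._rule = rule

    def is_satisfied(self, current: Context) -> bool:
        return self._rule(current)

class ContextualTag:
    """Contains no behavior evaluations; delegates to its associated
    Behavior objects."""
    def __init__(self, name: str):
        self.name = name
        self.behaviors: List[Behavior] = []

    def is_satisfied(self, current: Context) -> bool:
        return all(b.is_satisfied(current) for b in self.behaviors)

class TagFactory:
    """Creates contextual tag objects and associates one or more
    (possibly shared) behavior objects with them."""
    def create_tag(self, name: str,
                   behaviors: List[Behavior]) -> ContextualTag:
        tag = ContextualTag(name)
        tag.behaviors.extend(behaviors)
        return tag
```

Because the `Behavior` carries the evaluation, a single "daylight" behavior object, for instance, can back both a "beach" tag and a "park" tag without duplication.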
In an alternative embodiment, the mobile device may allow the user to manually force or override the current context, even if the forced context does not naturally exist based on the sensed context of the mobile device. This allows a user to force the mobile device to filter content contextually based on the context desired by the user, as opposed to the natural context sensed by the mobile device. For example, the user may want to access all content contextually tagged with a work location contextual tag while the user is on vacation. Instead of having to retag the content designated for a work context, the user can simply override the current context of the mobile device to force a work location context as the current context.
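The override embodiment can be sketched as a small indirection in front of the sensed context: filtering always asks a provider for the "current" context, and the provider returns the forced context when one is set. The class and method names are assumptions for illustration.

```python
# Illustrative sketch of a manually forced (override) context.
from typing import Callable, Dict, Optional

Context = Dict[str, object]

class ContextProvider:
    """Returns the naturally sensed context unless the user has
    forced an override, in which case the override takes precedence
    for all contextual filtering."""
    def __init__(self, sense: Callable[[], Context]):
        self._sense = sense          # e.g. reads the context sensors
        self._override: Optional[Context] = None

    def force(self, override: Context) -> None:
        self._override = override

    def clear(self) -> None:
        self._override = None

    def current(self) -> Context:
        return self._override if self._override is not None else self._sense()
```

In the vacation example, calling `force({"location": "work"})` makes work-tagged content visible without retagging; `clear()` restores natural, sensor-driven filtering.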
In another embodiment, the mobile device may be directed to implicitly contextually tag content without the user having to explicitly assign contextual tags. This allows the user to later recall content by selecting from previous contexts in which the user browsed and/or accessed content. For example, as a user accesses content in a normal fashion, the mobile device may automatically and silently contextually tag the content in the background, without the user's knowledge. If the user later desires to recall specific content, but can only remember the context in which the content was previously accessed, the user can review and select contextual tags assigned by the mobile device to recall that content.
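Implicit tagging can be sketched as recording a snapshot of the current context on each playback, then recalling titles whose recorded snapshots match a user-selected context. The class name, the snapshot representation, and the matching rule (every key/value pair of the selection must match) are assumptions made for this sketch.

```python
# Illustrative sketch of implicit contextual tagging during playback.
from typing import Dict, List

Context = Dict[str, object]

class ImplicitTagger:
    """Silently records the current context each time content is
    played, so content can later be recalled by a chosen context."""
    def __init__(self) -> None:
        self._implicit: Dict[str, List[Context]] = {}

    def on_playback(self, title: str, current: Context) -> None:
        # Snapshot the context in the background, unknown to the user.
        self._implicit.setdefault(title, []).append(dict(current))

    def recall(self, selected: Context) -> List[str]:
        # Return titles with at least one recorded context matching
        # every key/value pair in the user's selection.
        def matches(ctx: Context) -> bool:
            return all(ctx.get(k) == v for k, v in selected.items())
        return [title for title, ctxs in self._implicit.items()
                if any(matches(c) for c in ctxs)]
```

A user who only remembers "I listened to it at the gym" can then recall that content by selecting the gym context, even though no explicit tag was ever assigned.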
The mobile device employed by the present invention may be any type of mobile device, including but not limited to a cellular phone, a personal digital assistant (PDA), or a portable media player, as examples. The mobile device may or may not have communication capability, which may include wired communication, wireless communication, or both. If the mobile device has communication capability, both the content and the context of the mobile device, which is used to determine whether the contextual behaviors of contextual tags are satisfied for filtering, can be obtained from a remote system, such as a central content server.
Those skilled in the art will appreciate the scope of the present invention and realize additional aspects thereof after reading the following detailed description of the preferred embodiments in association with the accompanying drawing figures.
The accompanying drawing figures incorporated in and forming a part of this specification illustrate several aspects of the invention, and together with the description serve to explain the principles of the invention.
The embodiments set forth below represent the necessary information to enable those skilled in the art to practice the invention and illustrate the best mode of practicing the invention. Upon reading the following description in light of the accompanying drawing figures, those skilled in the art will understand the concepts of the invention and will recognize applications of these concepts not particularly addressed herein. It should be understood that these concepts and applications fall within the scope of the disclosure and the accompanying claims.
The present invention is a system and method of contextually filtering content presented to a user on a mobile device based on contextual tagging. The user controls how content will be filtered by the mobile device during operation by creating contextual tags and associating or tagging content with the contextual tags. The contextual tag includes a defined contextual behavior. The contextual behavior is an expression that is either satisfied or not based on the current context of the mobile device, a set of logical rules that apply to the current context and, an optional initial context. In this manner, the user controls the context that must be present for the mobile device in order for particular tagged content to be presented during operation. The user may use contextual tags to tag content deemed appropriate for certain contexts, but inappropriate for others. The mobile device is equipped to be “context aware.” The mobile device may use a sensed context to define the initial context of a contextual tag when created as well as the current context of the mobile device during operation. The context of the mobile device can be any condition or surrounding able to be sensed by the mobile device, including the user's interaction with the mobile device that can change and can be sensed or determined.
During operation, after contextual tags have been created and assigned to content by the user, content is searched to determine which items have contextual tags whose behavior is satisfied based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are satisfied is presented to the user. This means the particular content was previously designated by the user to be presented based on the current context of the mobile device. Content tagged with contextual tags whose contextual behaviors are not satisfied based on the current context of the mobile device is filtered out and not presented to the user. In this manner, the present invention allows the user to manage content and to automatically be presented with a more manageable subgroup of content on the mobile device based on the context of the mobile device. This is opposed to solely filtering content based on static criteria that do not adapt or change based on the context of the mobile device.
For example, the contextual behavior defined by a contextual tag may be location based. A location-based contextual tag may include a contextual behavior defined as the mobile device being at or in close proximity to a specified location stored as the initial context. How close is determined by the user via a logical expression assigned to the contextual tag that defines the desired behavior. For example, the desired behavior assigned by the user may be that the mobile device must be located within ten miles of a work place for the contextual behavior to be satisfied. When the detected current location of the mobile device (i.e. the current context) indicates that the mobile device is located within ten miles of the work place in this example (i.e. the initial context), the behavior will be satisfied. The mobile device would then make any content tagged with this location-based contextual tag available to the user.
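The ten-mile rule above can be sketched as a distance check between the current context and the stored initial context. The function names, coordinates, and the use of the haversine great-circle formula below are illustrative assumptions for this sketch, not details of the disclosed embodiment:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def location_behavior_satisfied(current, initial, radius_miles=10.0):
    """True when the current location lies within radius_miles of the
    initial context captured when the contextual tag was created."""
    return haversine_miles(current[0], current[1],
                           initial[0], initial[1]) <= radius_miles

# Hypothetical work place stored as the initial context at tag creation.
work = (35.7796, -78.6382)
# A sensed current location roughly five miles away satisfies the behavior.
nearby = (35.85, -78.64)
```

Any distance metric could be substituted for the haversine formula; the essential point is that the behavior expression compares the sensed current context against the stored initial context.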
The user can establish and associate contextual tags having any desired behavior with the content. A contextual tag may include only one contextual behavior. Alternatively, a contextual tag may include more than one contextual behavior to form a composite contextual behavior. Contextual behavior included in contextual tags can be based on any contextual attribute(s) that can be sensed by the mobile device. As examples, the contextual attributes could include conditions such as the location of the mobile device, time of day, day of week, date, velocity, acceleration, direction of travel, weather, amount of sunlight, proximity of the mobile device to other users, state or data of applications running on the mobile device, or combinations thereof. For contextual attributes that require sensing of the mobile device's external environment or surroundings, one or more context sensors or other hardware components associated with the mobile device may be used to determine the current context of the mobile device. In this manner, the mobile device is “context aware.” Contextual behaviors can also be based on the context of the user and/or their interaction with the mobile device. For example, the user may establish contextual tags for “home” and “work” behaviors. Content assigned with a contextual tag associated with “home” behavior may not be appropriate for a “work” context, and vice versa.
Before any content can be tagged, the user 18 first establishes one or more contextual tags (step 30,
The user 18 next associates or tags content with established contextual tags (step 32). This content may be stored locally on the mobile device 12, or the content may be accessible over the network 22 from a remote service provider. The remote service provider may include a content server 24 that provides content 26 from a server database 28 made accessible over the network 22. The content 26 in the server database 28 may be downloaded to the mobile device 12. Alternatively, the mobile device 12 may stream content 26 in lieu of a permanent download. In any case, the user 18 does not have to associate a contextual tag with all content accessible to the mobile device 12. However, the mobile device 12 may filter out content that is not tagged with a contextual tag. This is because untagged content will not be associated with a contextual behavior that can be satisfied. Alternatively, the mobile device 12 can be configured to present any content to the user 18 that is not contextually tagged.
The user 18 continues establishing contextual tags and/or contextually tagging content until satisfied that all desired content is contextually tagged with the desired behaviors (decision 34). Once the user 18 has finished the desired contextual tagging of content, the mobile device 12 can perform contextual filtering. The mobile device 12 can first update the current context to store the most current context of the mobile device 12 before filtering begins (step 36). The current context can be based on sensing environmental or surrounding conditions of the mobile device 12, or can be based on sensing internal conditions of the mobile device 12, such as the operation of an application or the user's 18 interaction with the mobile device 12, including its applications. The mobile device 12 can next scan contextually tagged content accessible to the mobile device 12 (step 38). If content has been contextually tagged, the mobile device 12 determines if the contextual behavior associated with the contextual tag for the content is satisfied (decision 40). If so, the mobile device 12 allows the content to be presented to the user 18 (step 42). If not, the mobile device 12 filters out the content from presentation to the user 18 since the contextual behavior associated with its contextual tag is not presently satisfied (step 44). For content that is not contextually tagged and thus has no associated contextual behavior, the mobile device 12 can be configured to either automatically filter untagged content out (step 44) or present untagged content to the user 18 (step 42).
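The scan-and-filter loop of steps 38-44 above can be sketched as follows. The function and data-structure names are hypothetical illustrations; the disclosed embodiment stores tags in the tag table 68 rather than a Python dictionary:

```python
def filter_content(content_items, tag_table, current_context,
                   present_untagged=False):
    """Scan each content item (step 38) and keep only items whose
    contextual tag's behavior is satisfied by the current context
    (decision 40). Untagged content is filtered out by default, or
    presented when the device is configured to do so."""
    presented = []
    for item in content_items:
        tag = tag_table.get(item)              # step 38: look up the tag
        if tag is None:                        # content with no contextual tag
            if present_untagged:
                presented.append(item)         # step 42: present untagged
            continue                           # step 44: filter out
        if tag["behavior"](current_context):   # decision 40
            presented.append(item)             # step 42: present
        # otherwise step 44: behavior not satisfied, content filtered out
    return presented
```

For example, with a tag whose behavior requires a “work” location, only content carrying that tag is presented while the device's current context is “work”; untagged content is presented only if the device is configured to do so.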
The mobile device 12 next determines if all the content accessible to the mobile device 12 has been scanned (decision 46). If not, the mobile device 12 repeats the filtering process in steps 38-44 (
To further illustrate the contextual filtering process and method discussed in
As discussed above, the mobile device 12 determines its current context to perform contextual filtering of content tagged with contextual tags according to embodiments of the present invention. The context of the mobile device 12 may be based on external conditions or surroundings, or internal conditions of the mobile device 12. For example, the current context may be based on an application executing internally on the mobile device 12 or the user's 18 interaction with the mobile device 12 and/or this application. The current context may also be based on external conditions or surroundings of the mobile device 12. In this case, the mobile device 12 may be equipped with one or more context sensors and/or other sensing devices that allow the mobile device 12 to determine its surroundings or environment. For example, if a contextual behavior associated with a particular contextual tag is location-based, the mobile device 12 needs to be able to determine its current location. The current location is used to analyze whether contextual tags having location-based behaviors are satisfied. In this regard, an exemplary mobile device 12 architecture is illustrated in
As illustrated in
A microphone 54 is another example of a contextual hardware sensor 50 that may be provided to determine noise level surrounding the mobile device 12. This may include ambient noise. Yet another example of a contextual hardware sensor 50 is a light meter 56. The mobile device 12 may include a light meter 56 to detect the surrounding light condition as part of detecting the current context. The contextual hardware sensors 50 can be any device or component that can sense a condition, surrounding, or behavior regarding the mobile device 12.
If a more precise determination of time is desired than the presence or lack of light provides, a clock 58 may be provided. The clock 58 may be provided in lieu of or in addition to the light meter 56. The clock 58 enables the mobile device 12 to determine the current time of day as contextual information that may be used to evaluate contextual behavior. The clock 58 may be updated using network communications via a network connectivity component 60 as is common in cellular phones. The network connectivity component 60 also allows the mobile device 12 to maintain a connection with server systems that might further aid in determining the mobile device's 12 current context. The network connectivity component 60 may also be used for downloading and/or otherwise transferring media content to and from the mobile device 12, including but not limited to the content server 24 (see
User interface components 62 may also be provided to allow the user 18 to interact with the mobile device 12. The user interface components 62 may include input devices, such as a keyboard, touch screen, or other buttons to allow the user 18 to provide input, including establishing contextual tags and contextually tagging content. The user interface components 62 may also include output devices, such as a display and/or a speaker for speech and sound output, to provide content to the user 18 in human-readable form. One or more software applications 64 may be included to drive the overall functionality of the mobile device 12, including operations based on receiving input from the user 18 and providing output to the user 18, via the user interface components 62.
A content database 66 may be provided to store content (including multimedia content) accessible to the mobile device 12. The software application 64 accesses the content database 66 to retrieve information regarding content available on the mobile device 12. The software application 64 provides this information to the user 18 via the output devices in the user interface components 62. The user 18 can then select particular content available in the content database 66 via input devices in the user interface components 62.
The content database 66 may also contain a tag table 68 to store contextual tags created by the user 18. The tag table 68 is also adapted to store the associations between the contextual tags and the content accessible by the mobile device 12 as part of the contextual tagging aspects of the present invention. The tag table 68 is consulted to determine if a contextual tag associated with content has contextual behavior that is satisfied based on the current context of the mobile device 12 as part of the contextual filtering of the present invention. One embodiment of the tag table 68 could include an instance of a contextual tag having a foreign key of the content item. The foreign key could be used as a primary key to access content from a table of content items stored in the content database 66.
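The foreign-key arrangement described above for the tag table 68 can be sketched with a relational schema. The table and column names below are illustrative assumptions; the disclosed embodiment does not specify a particular database engine or schema:

```python
import sqlite3

# In-memory database standing in for the content database 66.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE content_items (
        content_id INTEGER PRIMARY KEY,
        name       TEXT NOT NULL
    );
    CREATE TABLE tag_table (
        tag_id     INTEGER PRIMARY KEY,
        tag_name   TEXT NOT NULL,
        behavior   TEXT NOT NULL,        -- serialized behavior expression
        content_id INTEGER NOT NULL,     -- foreign key into content_items
        FOREIGN KEY (content_id) REFERENCES content_items(content_id)
    );
""")
conn.execute("INSERT INTO content_items VALUES (1, 'vacation_photos')")
conn.execute("INSERT INTO tag_table VALUES (1, 'home', 'location<=10mi', 1)")

# A satisfied tag resolves back to its content item via the foreign key.
row = conn.execute("""
    SELECT c.name FROM tag_table t
    JOIN content_items c ON c.content_id = t.content_id
    WHERE t.tag_name = 'home'
""").fetchone()
```

This mirrors the described usage: once a contextual tag's behavior is found to be satisfied, its foreign key is used to retrieve the associated content item from the content table.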
Non-volatile memory (NVM) 70 can also be provided in the mobile device 12. The NVM 70 may be used to store content in the content database 66 as well as the contextual tags and their associations with content in the tag table 68 persistently across power cycles of the mobile device 12. When the mobile device 12 is powered on, the content and the contextual tags could be moved from NVM 70 to volatile storage in the content database 66 and tag table 68, including but not limited to cache memory. The NVM 70 could be solid state (NVRAM) or magnetic media (HDD), as examples.
A tag management component 72 may also be provided in the mobile device 12 to facilitate contextual tag management. Particularly, the tag management component 72 may facilitate the creation, editing, deleting, accessing, and managing of contextual tags. The tag management component 72 facilitates access to contextual tags in the tag table 68 to determine if their associated contextual behaviors are satisfied by the current context of the mobile device 12 as part of the contextual filtering provided by the present invention. The tag management component 72 also facilitates creation of contextual tags in response to user 18 requests provided via the application software 64. The tag management component 72 also facilitates user 18 requests to set up and associate one or more contextual behaviors with contextual tags. The tag management component 72 can also facilitate storing the user's 18 requests to tag content with particular contextual tags in the tag table 68 for use in contextual filtering.
In one embodiment of the present invention, object-oriented design (OOD) principles are employed in the tag management component 72 to create contextual tags and associate contextual behaviors with the contextual tags. OOD may be particularly well suited for this function since OOD defines methods and attributes so as to associate behavior with data.
If the user 18 directs the mobile device 12 to create a simple tag 74, a call may be made to a tag factory in the tag management component 72 [e.g. createSimpleTag(name: string): void]. In response, the simple tag 74 is created in the form of a simple tag object 80 according to this OOD embodiment. As illustrated in
If the user 18 directs the mobile device 12 to create a contextual tag 76, as opposed to a simple tag 74, a call may be made to a tag factory in the tag management component 72. In response, a contextual tag object 82 may be created from a tag class according to this OOD embodiment as illustrated in
Alternatively, the user 18 can direct the mobile device 12 to create a “user-defined” contextual tag. A “user-defined” contextual tag is a contextual tag 76 assigned with user-defined behaviors. The user controls the context in which content is to be presented by the mobile device 12 by defining the desired behavior. The tag factory in the tag management component 72 may be called upon to create a “user-defined” contextual tag. A contextual tag object 82 is created in response. However, unlike “built-in” contextual tags, a user 18 can define and assign user-defined behaviors with a “user-defined” contextual tag according to the mobile device 12 application state and rules, allowing open-ended functionality.
In either the case of a “built-in” or “user-defined” contextual tag, a Behavior object 84 is associated with the contextual tag object 82 and called upon to determine if the assigned behavior is satisfied based on the current context of the mobile device 12. The Behavior object 84 contains a “user-defined” behavior for “user-defined” contextual tags as opposed to a “built-in” behavior for “built-in” contextual tags. The behaviors are expressed by logical statements that are evaluated based on the current context of the mobile device 12. The Behavior object 84 can also contain an initial context attribute if the behavior logical expression is based on a relationship between the current context of the mobile device 12 and an initial context when the contextual tag was created. For example, if the contextual tag is location-based, the logical expression in the Behavior object 84 may be based on whether the current context location of the mobile device 12 (i.e. a current context) is within a certain distance of the initial location (i.e. initial context). In this manner, the user 18 can associate the behavior to be satisfied based on the context of the mobile device 12 when the content was first tagged. For example, if the user 18 tags content with a work location contextual tag when the user 18 is at work, the work location will be stored in the initial context using the contextual sensor 50 to determine the desired location automatically for the user 18. The mobile device 12 can subsequently determine if the work contextual tag behavior is satisfied based on a comparison of the current context location with the location stored in the initial context when the contextual tag was created.
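The relationship between the contextual tag object 82 and its Behavior object 84, including the optional initial context attribute, might be modeled as follows. The class and method names are hypothetical; the OOD embodiment in the specification does not prescribe this exact interface:

```python
class Behavior:
    """Holds a logical expression plus an optional initial context
    captured from the context sensors when the tag was created."""
    def __init__(self, expression, initial_context=None):
        self.expression = expression          # logical rule to evaluate
        self.initial_context = initial_context

    def is_satisfied(self, current_context):
        # The expression relates the current context to the initial context.
        return self.expression(current_context, self.initial_context)

class ContextualTag:
    """Contextual tag delegating satisfaction checks to its Behavior."""
    def __init__(self, name, behavior):
        self.name = name
        self.behavior = behavior

    def is_satisfied(self, current_context):
        return self.behavior.is_satisfied(current_context)

# A "work" tag whose initial context is the value sensed at creation time,
# using a toy one-dimensional "within 2 units" distance rule.
within_2 = lambda cur, init: abs(cur - init) <= 2.0
work_tag = ContextualTag("work", Behavior(within_2, initial_context=5.0))
```

The delegation from tag to behavior matches the described call sequence, in which the contextual tag object 82 calls upon its Behavior object 84 to evaluate the logical expression against the current context.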
Additionally, the present invention also allows a user 18 to create a composite contextual tag 78. A composite contextual tag 78 is one which includes multiple contextual behaviors that must be satisfied. A composite contextual tag 78 is created in the OOD embodiment illustrated in
To further illustrate how contextual tags can be created according to embodiments of the present invention,
After the contextual tag object 82 and its Behavior object 84 (including the initial context defined with the current location of the mobile device 12) are created, the newly created contextual tag is stored in the tag table 68 (step 112). An identification (ID) of the contextual tag is returned to the calling application to identify the contextual tag (step 114). The contextual tag, via the ID, can then be used to tag content with the location-based behavior of the contextual tag as desired.
Contextual tagging can be performed either explicitly or implicitly. Content can be contextually tagged explicitly as a result of the user 18 making an explicit request to the mobile device 12 via the user interface 62. Alternatively, the mobile device 12 may be directed to implicitly contextually tag content without the user 18 explicitly directing or controlling the contextual tagging of content. This allows the user 18 to later recall content based on selecting from previous contexts in which the user 18 browsed and/or accessed content on the mobile device 12. For example, as a user 18 accesses content in a normal fashion, the mobile device 12 may automatically and silently contextually tag content accessed by the user 18. If the user 18 desires to later recall specific content, but the user 18 can only remember the context in which the content was previously accessed, the user 18 can review and select contextual tags assigned by the mobile device 12 to recall content. This allows the user 18 to recall and access content by context as opposed to having to recall the identification of the content itself.
In this regard,
Turning to
As discussed above in the flow diagram examples of creating contextual tags in
If the contextual tag is of the composite behavior type, more than one behavior expression or object is provided. Thus, one or more “child” behaviors 170, 172, 174 (“Child Behavior”) may be provided in the tag table 68. The composite contextual tag has multiple behaviors that must be met based on the current context of the mobile device 12 in order for the composite contextual tag to be satisfied. Only composite contextual tags contain information in the child behavior columns 170, 172, 174. The child behaviors 170, 172, 174 also each contain a behavior data type field, an initial context state, and rules so that whether a child behavior is satisfied based on the current context of the mobile device 12 can be determined. Defining contextual tags in the form of composite contextual tags may be particularly useful if each of the desired behaviors is already defined in the tag management component 72. In this case, rather than defining a new or complex contextual behavior, the user 18 can simply assign multiple and/or existing contextual behaviors to the contextual tag. Thus, in the case of the OOD embodiment example in
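The requirement above, that every child behavior of a composite contextual tag be satisfied by the current context, can be sketched with a simple composite pattern. The names and the dictionary-based context are illustrative assumptions:

```python
class CompositeBehavior:
    """Composite contextual behavior: satisfied only when every child
    behavior is satisfied by the current context of the device."""
    def __init__(self, children):
        self.children = children

    def is_satisfied(self, current_context):
        # All child behaviors must be met for the composite to be met.
        return all(child(current_context) for child in self.children)

# Two existing child behaviors reused to compose a "work hours" tag:
at_work = lambda ctx: ctx["location"] == "work"
weekday = lambda ctx: ctx["day"] in range(5)   # Mon=0 .. Fri=4
work_hours = CompositeBehavior([at_work, weekday])
```

Composing existing behaviors in this way reflects the stated advantage: the user reuses behaviors already defined in the tag management component rather than authoring a new complex expression.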
After contextual tags have been created and assigned to content by the user 18, the mobile device 12 can perform contextual filtering based on the current context of the mobile device 12. In this embodiment of the present invention, the mobile device 12 contextually filters content based on the current context of the mobile device 12 for simple, contextual, and composite contextual tags 74, 76, 78. Simple tags can be defined as having a behavior that is always met, or true, independent of the current context of the mobile device 12. Thus, the mobile device 12 can contextually filter all tagged content in the same manner since the behavior for simple tags is hardcoded as always being satisfied. This has the advantage of providing the same operation and algorithms for all tagged content. In this regard,
The process starts by the application software 64 requesting, and receiving in response, recently used context data from the hardware sensor(s) 50 (steps 210, 212). These optional steps allow the user 18 to be presented with past contextual data to use as the hypothetical location, as opposed to the user selecting a location that may never have been associated with a contextual tag as the initial context. The application software 64 may then provide a request to the tag management component 72 to determine if a particular location-based contextual tag associated with content is satisfied given the hypothetical location (step 214). The tag management component 72 provides a search request to find the specified contextual tag from the tag table 68 (step 216). This is because the contextual tags and their associated behaviors are stored in the tag table 68. The tag table 68 returns the contextual tag requested (step 218). The tag management component 72 then calls upon the contextual tag object 82 to determine if its behavior is satisfied or not based on the hypothetical location (step 220). The contextual tag object 82 calls upon its Behavior object 84 in response (step 222). The behavior expression in the Behavior object 84 is determined as either being satisfied or not (step 224). Note that unlike the flow diagram example in
If the Behavior object 84 is satisfied, this means that the hypothetical location satisfies the location stored as the initial context in the location-based contextual tag. The result of whether the behavior is satisfied is then sent to the contextual tag object 82 (step 226), which in turn is sent to the tag management component 72 (step 228) and the application software 64 (step 230). The application software 64 can then determine whether to present the content to the user 18 based on whether the behavior associated with the location-based contextual tag is satisfied.
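The uniform treatment of simple and contextual tags described above, in which a simple tag's behavior is hardcoded as always satisfied so that one filtering algorithm covers all tagged content, can be sketched as follows. The names and the string-valued context are illustrative assumptions:

```python
# Simple tags carry a behavior hardcoded to be satisfied in any context.
ALWAYS = lambda ctx: True

tags = {
    "notes.txt":  ALWAYS,                     # simple tag 74
    "report.doc": lambda ctx: ctx == "work",  # contextual tag 76
}

def satisfied(name, current_context):
    """One algorithm for all tagged content: evaluate the tag's behavior
    against the current context, regardless of tag type."""
    return tags[name](current_context)
```

Because every tag exposes the same behavior interface, the filtering code never needs to branch on whether a tag is simple, contextual, or composite.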
Those skilled in the art will recognize improvements and modifications to the preferred embodiments of the present invention. All such improvements and modifications are considered within the scope of the concepts disclosed herein and the claims that follow.