A display device displays a tag overlaid on a video scene in a first portion of a video screen. The displayed tag is associated with content depicted in the video, includes descriptive text information, and is clickable, so that upon selection by a user, additional information associated with the tag is displayed. Based at least in part on an indication that the tag has been selected by a user, the tag undergoes vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen. The display device displays the video and the tag overlaid on the video in the second portion of the video screen. The displaying of the tag includes displaying at least a portion of the additional information associated with the tag.
1. A method comprising:
causing a display device to display a tag overlaid on a video scene in a first portion of a video screen, the displayed tag being associated with content depicted in the video scene, wherein the displayed tag includes descriptive text information, and wherein the displayed tag is clickable, so that upon selection by a user, additional information associated with the displayed tag is displayed;
receiving an indication that the displayed tag has been selected by a user;
based at least in part on the received indication, causing the displayed tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
in response to the repositioning, causing the display device to display the video scene in the first portion of the video screen and the displayed tag in the second portion of the video screen, wherein the displaying of the displayed tag in the second portion of the video screen includes displaying at least a portion of the additional information associated with the displayed tag.
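The sequence recited in claim 1 — display a clickable tag over the video, receive a selection indication, reposition the tag to a second portion of the screen, and reveal its additional information — can be illustrated with a minimal, non-limiting sketch. The class, field, and function names and the two-region model are assumptions for illustration only and are not part of the claims:

```python
from dataclasses import dataclass

@dataclass
class Tag:
    """A clickable overlay tag associated with content depicted in the scene."""
    text: str                  # descriptive text information
    info: str                  # additional information revealed after selection
    region: str = "overlay"    # "overlay" = first portion; "menu" = second portion
    show_info: bool = False

def select_tag(tag: Tag) -> Tag:
    """Handle an indication that the displayed tag has been selected:
    reposition the tag to the second portion of the screen and display
    at least a portion of its additional information."""
    tag.region = "menu"        # vertical and/or horizontal repositioning
    tag.show_info = True
    return tag

tag = Tag(text="Red jacket", info="Details about the depicted item")
select_tag(tag)
print(tag.region, tag.show_info)   # menu True
```

In this sketch the video itself continues to play in the first portion; only the tag's region and visibility state change in response to the selection indication.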
7. An apparatus for use with a video display device, the apparatus comprising:
a hardware processor; and
a memory storing instructions that configure the hardware processor to:
cause the display device to display a tag overlaid on a video scene in a first portion of a video screen, the displayed tag being associated with content depicted in the video scene, wherein the displayed tag includes descriptive text information, and wherein the displayed tag is clickable, so that upon selection by a user, additional information associated with the tag is displayed;
receive an indication that the displayed tag has been selected by a user;
based at least in part on the received indication, cause the displayed tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
in response to the repositioning, cause the display device to display the video scene in the first portion of the video screen and the displayed tag in the second portion of the video screen, wherein the displaying of the displayed tag includes displaying at least a portion of the additional information associated with the displayed tag.
13. A non-transitory computer-readable medium having instructions stored thereon, the instructions comprising:
instructions to cause a display device to display a tag overlaid on a video scene in a first portion of a video screen, the displayed tag being associated with content depicted in the video scene, wherein the displayed tag includes descriptive text information, and wherein the displayed tag is clickable, so that upon selection by a user, additional information associated with the displayed tag is displayed;
instructions to receive an indication that the displayed tag has been selected by a user;
instructions to cause, based at least in part on the received indication, the displayed tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
instructions to cause, in response to the repositioning, the display device to display the video scene in the first portion of the video screen and the displayed tag in the second portion of the video screen, wherein the displaying of the displayed tag includes displaying at least a portion of the additional information associated with the displayed tag.
2. The method of
3. The method of
causing the display device to display a second tag overlaid on the video scene in a first portion of the video screen, the displayed second tag being associated with second content depicted in the video scene, wherein the displayed second tag includes descriptive text information, and wherein the displayed second tag is clickable, so that upon selection by the user, additional information associated with the displayed second tag is displayed;
receiving an indication that the displayed second tag has been selected by the user;
based at least in part on the received indication that the displayed second tag has been selected by the user, causing the second tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
in response to the repositioning of the displayed second tag, causing the display device to display the video scene in the first portion of the video screen and the displayed second tag in the second portion of the video screen, wherein the displaying of the displayed second tag includes displaying at least a portion of the additional information associated with the displayed second tag.
4. The method of
causing the display device to display, simultaneously with the displayed tag, a visually perceptible indicator extending between a proximity where an item is depicted in the video scene and a proximity of the tag.
5. The method of
adjusting the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning and causing the display device to additionally display the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning.
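The adjustment recited above — keeping the visually perceptible indicator drawn between the item's proximity and the tag's proximity while the tag moves — amounts to recomputing the indicator's endpoints at each animation step. A minimal sketch; the coordinates and the linear-interpolation scheme are illustrative assumptions, not limitations from the claims:

```python
def indicator_endpoints(item_xy, tag_xy):
    """Endpoints of the indicator extending from near the depicted item to near the tag."""
    return item_xy, tag_xy

def reposition(start_xy, end_xy, t):
    """Tag position at animation progress t in [0, 1], linearly interpolated."""
    return (start_xy[0] + (end_xy[0] - start_xy[0]) * t,
            start_xy[1] + (end_xy[1] - start_xy[1]) * t)

item = (320, 180)                      # where the item is depicted in the scene
start, end = (300, 160), (640, 40)     # first portion -> second portion
for t in (0.0, 0.5, 1.0):
    tag_pos = reposition(start, end, t)
    line = indicator_endpoints(item, tag_pos)
    # redraw the indicator each step so it continues to track the moving tag
print(line)  # ((320, 180), (640.0, 40.0))
```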
6. The method of
8. The apparatus of
9. The apparatus of
cause the display device to display a second tag overlaid on the video scene in a first portion of the video screen, the displayed second tag being associated with second content depicted in the video scene, wherein the displayed second tag includes descriptive text information, and wherein the displayed second tag is clickable, so that upon selection by the user, additional information associated with the displayed second tag is displayed;
receive an indication that the displayed second tag has been selected by the user;
based at least in part on the received indication that the displayed second tag has been selected by the user, cause the displayed second tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
in response to the repositioning of the displayed second tag, cause the display device to display the video scene in the first portion of the video screen and the displayed second tag in the second portion of the video screen, wherein the displaying of the displayed second tag includes displaying at least a portion of the additional information associated with the displayed second tag.
10. The apparatus of
cause the display device to display, simultaneously with the displayed tag, a visually perceptible indicator extending between a proximity where an item is depicted in the video scene and a proximity of the displayed tag.
11. The apparatus of
adjust the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning and cause the display device to additionally display the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning.
12. The apparatus of
14. The non-transitory computer-readable medium of
15. The non-transitory computer-readable medium of
instructions to cause the display device to display a second tag overlaid on the video scene in a first portion of the video screen, the displayed second tag being associated with second content depicted in the video scene, wherein the displayed second tag includes descriptive text information, and wherein the displayed second tag is clickable, so that upon selection by the user, additional information associated with the displayed second tag is displayed;
instructions to receive an indication that the displayed second tag has been selected by the user;
instructions to cause, based at least in part on the received indication that the displayed second tag has been selected by the user, the displayed second tag to undergo vertical and/or horizontal repositioning relative to the first portion of the video screen, to a second portion of the video screen; and
instructions to cause, in response to the repositioning of the displayed second tag, the display device to display the video scene in the first portion of the video screen and the displayed second tag in the second portion of the video screen, wherein the displaying of the displayed second tag includes displaying at least a portion of the additional information associated with the displayed second tag.
16. The non-transitory computer-readable medium of
instructions to cause the display device to display, simultaneously with the displayed tag, a visually perceptible indicator extending between a proximity where an item is depicted in the video scene and a proximity of the displayed tag.
17. The non-transitory computer-readable medium of
instructions to adjust the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning and to cause the display device to additionally display the visually perceptible indicator while the displayed tag overlaid on the video scene undergoes the vertical and/or horizontal repositioning.
18. The non-transitory computer-readable medium of
This application is a continuation of U.S. patent application Ser. No. 15/907,095, filed on Feb. 27, 2018, entitled “MOVING VIDEO TAGS,” now U.S. Pat. No. 10,187,688, which is a continuation of U.S. patent application Ser. No. 15/269,701, filed on Sep. 19, 2016, entitled “MOVING VIDEO TAGS,” now U.S. Pat. No. 9,906,829, which is a continuation of U.S. patent application Ser. No. 12/172,185, filed on Jul. 11, 2008, entitled “MOVING VIDEO TAGS OUTSIDE OF A VIDEO AREA TO CREATE A MENU SYSTEM,” now U.S. Pat. No. 9,451,195, issued on Sep. 20, 2016, which claims priority from U.S. Provisional Patent Application No. 60/949,505, filed on Jul. 12, 2007, entitled “VIDEO TAGS OUTSIDE OF VIDEO AREA.” The entire disclosures of all of the foregoing patent applications are hereby incorporated by reference herein.
This application is related to co-pending U.S. patent application Ser. No. 11/499,315, filed on Aug. 4, 2006, entitled “DISPLAYING TAGS ASSOCIATED WITH ITEMS IN A VIDEO PLAYBACK,” the entire disclosure of which is hereby incorporated by reference herein.
This application is also related to co-pending U.S. patent application Ser. No. 11/669,901, filed on Jan. 31, 2007, entitled “AUTHORING TOOL FOR PROVIDING TAGS ASSOCIATED WITH ITEMS IN A VIDEO PLAYBACK,” now U.S. Pat. No. 8,656,282, issued on Feb. 18, 2014, the entire disclosure of which is hereby incorporated by reference herein.
Tags in a video area are established by displaying the tags with a visual association to an item in the video. Thereafter, the tags move to a tag menu area that is outside of the video area. Tags are selectable, such as by clicking on the tag, to cause additional actions such as displaying a web page related to the tag. Tags move and disappear/appear in conjunction with the video as the video action progresses. In one embodiment, the tag menu area is obtained when a video with a first aspect ratio is displayed in a display area that has a different aspect ratio. The difference in aspect ratios leaves a portion in the display area that is not used for the video that can be used for the tag menu area.
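The aspect-ratio arrangement described above can be made concrete: when the video is scaled to the full display height at its own aspect ratio, any horizontal leftover (the pillarbox strip) is available for the tag menu area. A minimal sketch; the dimensions and the pillarbox-only case are illustrative assumptions:

```python
def tag_menu_width(display_w, display_h, video_aspect):
    """Width of the display area left unused when the video is scaled to
    full display height at its own aspect ratio (pillarbox case); this
    leftover strip can host the tag menu area."""
    video_w = display_h * video_aspect
    return max(0, display_w - video_w)

# A 16:9 display (1920x1080) showing a 4:3 video leaves a side strip
leftover = tag_menu_width(1920, 1080, 4 / 3)
print(leftover)  # 480.0
```

The letterbox case (video wider than the display, leaving strips above and below) follows the same arithmetic with width and height swapped.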
However, in many cases it is undesirable or impossible to change the video display area to exactly match the video aspect ratio. For example, a user may have an SD television and may wish to view an HD program. In such a case,
Similarly,
Many other types of hardware and software platforms can be used to implement the functionality described herein. For example, a video player can be included in a portable device such as a laptop, PDA, cell phone, game console, e-mail device, etc. The tag data can reside on a storage device, server, or other device that is accessed over another network. In general, the functions described can be performed by any one or more devices, processes, subsystems, or components, at the same or different times, executing at one or more locations.
Accordingly, particular embodiments can provide for computer playback of video that supports automatic capturing of screen snapshots and the accommodation of tag information outside of a video play area. Further, while particular examples have been described herein, other structures, arrangements, and/or approaches can be utilized in particular embodiments.
Any suitable programming language can be used to implement features of the present invention including, e.g., C, C++, Java, PL/I, assembly language, etc. Different programming techniques can be employed, such as procedural or object-oriented. The routines can execute on a single processing device or on multiple processors. The order of operations described herein can be changed. Multiple steps can be performed at the same time. The flowchart sequence can be interrupted. The routines can operate in an operating system environment or as stand-alone routines occupying all, or a substantial part, of the system processing.
Steps can be performed by hardware or software, as desired. Note that steps can be added to, taken from or modified from the steps in the flowcharts presented in this specification without deviating from the scope of the invention. In general, the flowcharts are only used to indicate one possible sequence of basic operations to achieve a function.
In the description herein, numerous specific details are provided, such as examples of components and/or methods, to provide a thorough understanding of embodiments of the present invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the present invention.
As used herein, the various databases, application software, or network tools may reside in one or more server computers and, more particularly, in the memory of such server computers. As used herein, “memory” for purposes of embodiments of the present invention may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. The memory can be, by way of example only and not by limitation, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, propagation medium, or computer memory.
A “processor” or “process” includes any human, hardware and/or software system, mechanism or component that processes data, signals or other information. A processor can include a system with a general-purpose central processing unit, multiple processing units, dedicated circuitry for achieving functionality, or other systems. Processing need not be limited to a geographic location, or have temporal limitations. For example, a processor can perform its functions in “real time,” “offline,” in a “batch mode,” etc. Portions of processing can be performed at different times and at different locations, by different (or the same) processing systems.
Reference throughout this specification to “one embodiment,” “an embodiment,” “a particular embodiment,” or “a specific embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention and not necessarily in all embodiments. Thus, respective appearances of the phrases “in one embodiment,” “in an embodiment,” “in a particular embodiment,” or “in a specific embodiment” in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any specific embodiment of the present invention may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments of the present invention described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the present invention.
Embodiments of the invention may be implemented by using a programmed general-purpose digital computer, application-specific integrated circuits, programmable logic devices, or field-programmable gate arrays; optical, chemical, biological, quantum, or nanoengineered systems, components, and mechanisms may also be used. In general, the functions of the present invention can be achieved by any means as is known in the art. Further, distributed or networked systems, components, and circuits can be used. Communication, or transfer, of data may be wired, wireless, or by any other means.
It will also be appreciated that one or more of the elements depicted in the drawings/figures can also be implemented in a more separated or integrated manner, or even removed or rendered as inoperable in certain cases, as is useful in accordance with a particular application. It is also within the spirit and scope of the present invention to implement a program or code that can be stored in a machine readable medium to permit a computer to perform any of the methods described above.
Additionally, any signal arrows in the drawings/Figures should be considered only as exemplary, and not limiting, unless otherwise specifically noted. Furthermore, the term “or” as used herein is generally intended to mean “and/or” unless otherwise indicated. Combinations of components or steps will also be considered as being noted, where terminology is foreseen as rendering the ability to separate or combine unclear.
As used in the description herein and throughout the claims that follow, “a,” “an,” and “the” include plural references unless the context clearly dictates otherwise. Also, as used in the description herein and throughout the claims that follow, the meaning of “in” includes “in” and “on” unless the context clearly dictates otherwise.
The foregoing description of illustrated embodiments of the present invention, including what is described in the Abstract, is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the present invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the present invention in light of the foregoing description of illustrated embodiments of the present invention and are to be included within the spirit and scope of the present invention.
Thus, while the present invention has been described herein with reference to particular embodiments thereof, a latitude of modification, various changes and substitutions are intended in the foregoing disclosures, and it will be appreciated that in some instances some features of embodiments of the invention will be employed without a corresponding use of other features without departing from the scope and spirit of the invention as set forth. Therefore, many modifications may be made to adapt a particular situation or material to the essential scope and spirit of the present invention. It is intended that the invention not be limited to the particular terms used in following claims and/or to the particular embodiment disclosed as the best mode contemplated for carrying out this invention, but that the invention will include any and all embodiments and equivalents falling within the scope of the appended claims.