A method for navigating through media content is described. The method (600) includes receiving (610) a media content stream, generating (620) a plurality of images associated with the media content stream after it is received, retrieving (630) information for a future media content stream, and providing (650) the information for the future media content stream along with the plurality of images for display along with the media content stream. An apparatus (300) for navigating through media content is also described. The apparatus (300) includes an input signal receiver (302) that receives a media content stream, a video processor (310) that generates a plurality of images associated with the media content stream after it is received, a controller (314) that retrieves information for a future media content stream that is available after the media content stream is received, and a display interface (318) that provides the information for the future media content stream along with the plurality of images associated with the media content stream for display along with the media content stream.
1. A method comprising:
receiving a media content stream;
storing media content associated with the media content stream;
generating a plurality of images associated with the media content stream after it is received, by extracting still video frames from the stored media content, the plurality of images arranged in a visual timeline in order to navigate in the media content stream;
retrieving information for a future media content stream that is available after the media content stream is received; and
providing, on a same display screen window, the information for the future media content stream along with the plurality of images for display along with the media content stream, the future media content positioned at the end of the visual timeline, wherein the plurality of images associated with the stored media content is selectable on the same display screen window for navigating to a time in the visual timeline in the stored media content and the information for the future media content stream is also selectable on the same display screen window for providing additional information for the future media content stream.
14. An apparatus comprising:
an input signal receiver that receives a media content stream;
a storage device configured to store media content associated with the received media content stream;
a video processor coupled to the input signal receiver, the video processor generating a plurality of images associated with the media content stream after it is received, by extracting still video frames from the stored media content, the plurality of images arranged in a visual timeline in order to navigate in the media content stream;
a controller coupled to the video processor, the controller retrieving information for a future media content stream that is available after the media content stream is received; and
a display interface coupled to the video processor, the display interface providing, on a same display screen window, the information for the future media content stream along with the plurality of images associated with the media content stream for display along with the media content stream, the future media content positioned at the end of the visual timeline, wherein the plurality of images associated with the stored media content is selectable on the same display screen window for navigating to a time in the visual timeline in the stored media content and the information for the future media content stream is also selectable on the same display screen window for providing additional information for the future media content stream.
20. An apparatus comprising:
a means for receiving a media content stream;
a means for storing media content associated with the received media content stream;
a means for generating a plurality of images associated with the media content stream after it is received, by extracting still video frames from the stored media content;
a means for retrieving information for a future media content stream that is available after the media content stream is received; and
a means for providing, on a same display screen window, the information for the future media content stream along with the plurality of images for display along with the media content stream, wherein the plurality of images associated with the stored media content is selectable on the same display screen window for navigating to a time in the visual timeline in the stored media content and the information for the future media content stream is also selectable on the same display screen window for providing additional information for the future media content stream.
[Dependent claims 2-3, 5-13, and 30-31 (method) and claims 15-19 and 21-39 (apparatus) are truncated in the source text.]
This application claims the benefit, under 35 U.S.C. § 365 of International Application PCT/US13/43044, filed May 29, 2013, which was published in accordance with PCT Article 21(2) on Dec. 4, 2014 in English.
The present disclosure generally relates to systems that provide media content together with a visual timeline associated with the media content and the current channel programming. More particularly, the present disclosure is related to an apparatus and method for navigating through current media content using visual representations associated with the current media content.
This section is intended to introduce the reader to various aspects of art, which may be related to the present embodiments that are described below. This discussion is believed to be helpful in providing the reader with background information to facilitate a better understanding of the various aspects of the present disclosure. Accordingly, it should be understood that these statements are to be read in this light.
Broadcast content service providers and Internet service providers continue to find synergies within their respective content delivery systems. Many networked devices now include the ability to navigate and search through media content based on inherent capabilities from the provider of the media content or service. Networked devices, particularly those devices used in a home, are merging operations and functions associated with broadcast-centric and Internet network-centric devices. These networked devices include televisions, settop boxes, home gateways, home computer media stations, tablets, and the like. These networked devices further offer signal receiving, media recording, home networking, and Internet connectivity capabilities.
The user often has the ability to navigate through available video content in terms of what is currently being viewed and separately through what may be available for viewing. Video content navigation techniques for currently viewed content typically include features such as pause, fast forward, and rewind. These techniques and tools are often useful for navigating through previously recorded content. However, these techniques and tools may be less effective with live or currently delivered content and also lack visual cues related to the navigation through the content. For example, the user may wish to have a visual display of a timeline of the current program as it is being displayed as well as additional information about future programming. The user may wish to rewind to a specific point in an earlier portion of a currently viewed program or to fast forward through that program from a point that has previously been broadcast and displayed. The user may also wish to know when a currently viewed program is ending and what program will be shown following the current program. Computer video editing tools may provide related timeline management techniques for media content stored on the computer. However, these tools do not appear to work in a live video display setting and further do not include other future program identification as part of the navigation functions.
To address content that may be available for viewing, broadcast content service providers often deliver a program guide that includes current and upcoming programming for the various channels in the service. A program guide allows a user to search for and navigate to other programs available from the service provider. The program guide is often downloaded to a user's home network device (e.g., gateway or settop box) through the broadcast network. The program guide contents and program information are then displayed on a display device based on requests from the user. However, the display of the program guide may not always be visually useful or ideally acceptable to the user, especially as it relates to the currently viewed program. The information in the program guide is not always consistent with providing a timeline of information for available programs in conjunction with the currently viewed program. As a result, none of these structures, techniques, and tools addresses the needs of a user desiring a visual timeline for a currently viewed program that improves navigation and further shows future available program content. The present disclosure addresses these and other shortcomings related to media program content timeline presentation for current and future programming.
According to an aspect of the present disclosure, a method for navigating through media content is described. The method includes receiving a media content stream, generating a plurality of images associated with the media content stream after it is received, retrieving information for a future media content stream that is available after the media content stream is received, and providing the information for the future media content stream along with the plurality of images for display along with the media content stream.
According to another aspect of the present disclosure, an apparatus for navigating through media content is described. The apparatus includes an input signal receiver that receives a media content stream, a video processor coupled to the input signal receiver, the video processor generating a plurality of images associated with the media content stream after it is received, a controller coupled to the video processor, the controller retrieving information for a future media content stream that is available after the media content stream is received, and a display interface coupled to the video processor, the display interface providing the information for the future media content stream along with the plurality of images associated with the media content stream for display along with the media content stream.
These, and other aspects, features and advantages of the present disclosure will be described or become apparent from the following detailed description of the preferred embodiments, which is to be read in connection with the accompanying drawings.
It should be understood that the drawing(s) are for purposes of illustrating the concepts of the disclosure and not necessarily the only possible configuration for illustrating the disclosure.
It should be understood that the elements shown in the figures may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces. Herein, the phrase “coupled” is defined to mean directly connected to or indirectly connected with through one or more intermediate components. Such intermediate components may include both hardware and software based components.
The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its scope.
All examples and conditional language recited herein are intended for educational purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term “processor” or “controller” should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor (DSP) hardware, read only memory (ROM) for storing software, random access memory (RAM), and nonvolatile storage.
Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
The present embodiments address problems associated with effectively navigating program content that includes information associated with a program currently being viewed. Video content navigation tools do not always provide a simple visual interface as part of a search and navigation tool. Conventional fast forward and rewind controls do not provide any information that includes a visual timeline or any other visual navigation cues. Program guide displays may show the program content for time slots and channels in a grid as text entries. However, these conventional guides do not include visual enhancements and further do not show a visual timeline that incorporates both the current program content and future programs. The present embodiments address these problems by creating a navigation tool for media content. The media content may be delivered over a broadcast network or other similar media service provider network. The embodiments incorporate the use of thumbnails or small visual images as identifiers for points during the current program. The embodiments further include one or more thumbnails or text entries identifying future content available on, for instance, the currently tuned channel from the network.
The present disclosure describes a program display structure that includes a portion of the display dedicated to thumbnails identifying fast forward and rewind navigation points in a visual timeline for a currently viewed program. The thumbnails represent time points in the program for the portion of the program that has already been received and/or viewed. These thumbnails also may be applied to identifying future programs planned or available. A receiving device buffers or stores the received and/or viewed content. The receiving device makes this content available for navigation features. The navigation feature makes the content navigable or browsable using a series of thumbnail images (e.g., still video frames from the content). A rewind browse feature is available by navigating to one or more thumbnail images for the content that was already displayed. Further, a fast forward browse feature is available by navigating to one or more thumbnails (e.g., poster or still video frame) at a point in time at the end of the currently viewed program. An additional fast forward feature is available by navigating to a thumbnail or text window showing the program or programs available at the end of the current program. The timeline may be displayed, and navigation performed, in terms of time relative to the current program or in terms of absolute time. Further, each thumbnail may represent its own video clip, which may be played when the user navigates to or pauses over the thumbnail.
Turning now to FIG. 1, a block diagram of an embodiment of a system 100 for delivering media content to a user's home is shown. Media content originates from a content source 102 and is provided to a broadcast affiliate manager 104, which delivers the content to a user's gateway device 120 over a first delivery network, delivery network 1 (106).
Additional information (e.g., special notices or scheduling information) or other content not provided to the broadcast affiliate manager may be delivered from content source 102 to a content manager 110. The content manager 110 may be a service provider affiliated with a content provider, broadcast service, or delivery network service. The content manager 110 may operate through an Internet website or web service. The content manager 110 may also incorporate Internet content into the delivery system. The content manager 110 may deliver the content to the user's gateway device 120 over a separate delivery network, delivery network 2 (112). Delivery network 2 (112) may include high-speed broadband Internet type communications systems. It is important to note that the content from the broadcast affiliate manager 104 may also be delivered using all or parts of delivery network 2 (112) and content from the content manager 110 may be delivered using all or parts of delivery network 1 (106). In addition, the user may also obtain content directly from the Internet via delivery network 2 (112) without necessarily having the content managed by the content manager 110.
Data server 116 receives the information from broadcast affiliate manager 104 and translates the information into a content stream suitable for delivery to a user device (e.g., client device 108). Data server 116 may include a web service for a web site or some networking site. Data server 116 may connect to delivery network 2 (112) to provide the content stream and information to the gateway device 120. Alternatively, data server 116 may include a network interface to a cellular network or other wireless delivery network and provide the content and information in a format compatible with the wireless network directly to gateway device 120. Additionally, data server 116 may receive information from the Internet through, for instance, content manager 110 and delivery network 2 (112). The additional interface permits other content, such as, but not limited to, information related to programs, supplemental content, and scheduling, to be provided to and from data server 116 from sources other than broadcast affiliate manager 104 (e.g., other users, websites, or news agencies).
Gateway device 120 may receive different types of content from one or more of the delivery networks described earlier as well as provide messages or information from devices in the user's home back to one or more of the delivery networks. Gateway device 120 processes the content and provides a separation of the content based on instructions provided with the content or over the delivery network. Gateway device 120 may also process and separate content based on instructions received via user commands communicated from client devices 108A-108N through home network 122. Gateway 120 may also provide storage, such as a hard drive or optical disk drive, for recording and/or storing the content as well as providing the content for playback through home network 122. Gateway device 120 may be a settop box, home media server, computer media station, home network gateway, multimedia player, modem, router, home network appliance, or the like.
Gateway device 120 provides the interface between the networks, operating as a wide area network (WAN), and the home network 122, operating as a local area network (LAN). Home network 122 may include both wired and wireless LAN communications. Wired communications may include physical interfaces and wiring for Ethernet, Multimedia over Coaxial cable Alliance (MoCA), or other similar signal formats. Wireless communications may include physical interfaces to accommodate one or more wireless formats including Wi-Fi, Institute of Electrical and Electronics Engineers standard IEEE 802.11 or other similar wireless communications protocols.
Home network 122 connects the gateway device 120 to client devices 108A-108N at the user's premises. Each of the client devices 108A-108N may include one or both interfaces for wired or wireless communication connection to the home network 122. Client devices 108A-108N may be a cable STB, Internet protocol (IP) STB, or satellite STB. The client devices 108A-108N may also be, but are not limited to, a computer device, tablet, display device, television, wireless phone, personal digital assistant (PDA), gaming platform, remote control, multi-media player, or home networking appliance that includes both broadcast and Internet interfaces, and may further include a storage medium for digital video recording. The client devices 108A-108N further process the content from gateway 120 provided through home network 122 and also process and manage user preferences and commands. The client devices 108A-108N may also include a storage device, such as a hard drive or optical disk drive, for recording and playing back audio and video content. Further details of the operation of the client devices 108A-108N and features associated with processing and playing back stored content will be described below in relation to FIG. 3.
The processed content from client devices 108A-108N is provided to display devices 114A-114N. The display devices 114A-114N may be conventional two-dimensional (2-D) type displays or may alternatively be advanced three-dimensional (3-D) type displays. It should be appreciated that other devices having display capabilities including, but not limited to, computer devices, tablets, gateways, display devices, televisions, wireless phones, PDAs, computers, gaming platforms, remote controls, multi-media players, home networking appliances or the like, may employ the teachings of the present disclosure and are considered within the scope of the present disclosure.
It is important to note that any media programs and content as well as any information related to the media programs and content (e.g., program guides, program metadata, visual images, or trailers) may originate at a content source (e.g., content source 102) or at a server device (e.g., data server 116). The content and information may be transmitted to a content manager and eventually delivered over either delivery network 1 (106) or delivery network 2 (112) to a user's home (e.g., gateway device 120 and client devices 108A-108N). Alternatively, content and information from the content source may be delivered to a data server, such as data server 116, re-formatted, and then delivered to the user's home. Still further, content and information may originate at the data server (e.g., data server 116) or at a third party source on the Internet and be provided to the data server 116 for delivery to the user's home.
In operation, system 100 provides the networking and communication capability for connecting and sharing media content from the WAN between devices on a home network, principally through the use of gateway device 120 and client devices 108A-108N. In one embodiment, program content received from broadcast affiliate manager 104 is buffered in gateway device 120. Gateway device 120 provides the buffered content to client device 108A in response to a user request. A user may also request a visual timeline feature through client device 108A. Gateway device 120 further provides visual images, or thumbnails, representative of the received and buffered content at predetermined time points in order to create the visual timeline feature for the currently provided program. Further, data associated with one or more programs provided from broadcast affiliate manager 104 is also received by gateway device 120 and assembled into a program guide. Gateway device 120 may store the program guide and may also search for and acquire other data (e.g., visual images, video trailers) from another source (e.g., content manager 110 or data server 116) and either store this information, pass it on to client device 108A, or both. The information, including visual images, for a program that follows the current program on the same receiving channel is provided to client device 108A along with the visual timeline feature.
In another embodiment, data associated with programs, provided from broadcast affiliate manager 104, is received by gateway device 120, assembled into a program guide, and provided to client device 108B. Gateway device 120 may store the program guide and may also search for and acquire other data (e.g., visual images, video trailers) from another source (e.g., content manager 110 or data server 116) and either store this information, pass it on to the client device 108B, or both. At another point in time, the user of client device 108A desires to view a program and may request the program using the guide through a user interface. The program is provided for display to display device 114A through client device 108A. Gateway device 120 or client device 108A may begin recording the program and generating visual representations for points in the recorded program and also identify the next program at the end of the current program. The user of client device 108A may recall the visual timeline and navigate through the timeline elements and the program based on user inputs. The visual timeline, along with user control of the visual timeline, will be described in further detail below.
Turning to FIG. 2, a block diagram of an embodiment of a home network system 200 is shown. Home network system 200 includes a gateway device 210 connected through a home network 220 to a digital media server (DMS) 230, a digital media renderer (DMR) 240, and a mobile digital media controller (M-DMC) 250, with the devices communicating using the digital living network alliance (DLNA) protocol.
Home network system 200 operates as part of a communication network and acts to interface data and media content between the WAN portion of the communication network and one or more devices on a home network. Gateway device 210 provides the interface between the WAN and home network 220. In one embodiment, gateway 210 provides Internet protocol (IP) services (e.g., data, voice, video, and/or audio) between a cable, DSL, or satellite WAN and DMS 230. Gateway 210 also provides IP services between DMS 230 and Internet destinations identified and connected through the WAN. Gateway 210 also provides IP voice services between a phone device, such as an M-DMC 250, and call destinations routed through the WAN.
Home network 220 may include wireless communication capability. A wireless interface may be included in gateway device 210 and may accommodate one or more wireless formats including Wi-Fi, Institute of Electrical and Electronics Engineers standard IEEE 802.11 or other similar wireless communication formats. A similar wireless interface may be included in DMS 230 and may also be included in DMR 240 and M-DMC 250. Home network 220 may also include wired communication capability. A wired communication interface may be included in gateway device 210 and may accommodate one or more wired formats including Ethernet, MoCA, or other similar signal protocols. A similar wired interface may be included on DMS 230 as well as on DMR 240 and M-DMC 250.
Further, both the wired and wireless interfaces and communication formats in home network 220 may support a communication protocol between devices based on the DLNA protocol. DLNA defines interoperability guidelines to enable sharing of digital media content such as music, photos and videos between devices such as computers, TVs, printers, cameras, cell phones, and other multimedia devices. DLNA uses Universal Plug and Play (UPnP) for media management, discovery and control. UPnP defines the type of device that DLNA supports (e.g., “server”, “renderer”, “controller”) and the mechanisms for accessing media over a network. The DLNA guidelines also apply a layer of restrictions over the types of media file formats, encodings and resolutions that a device must support.
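For illustration only, the sketch below shows the SSDP search step that UPnP-based discovery relies on. The multicast address, port, and M-SEARCH headers are standard SSDP values; the search target, timeout, and function names are illustrative choices rather than details from the disclosure.

```python
# Minimal SSDP (UPnP) discovery sketch: DLNA devices such as media servers
# respond to an M-SEARCH multicast query with their description URL.
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
SEARCH_TARGET = "urn:schemas-upnp-org:device:MediaServer:1"  # DLNA "server" class

def discover(timeout=2.0):
    request = "\r\n".join([
        "M-SEARCH * HTTP/1.1",
        f"HOST: {SSDP_ADDR}:{SSDP_PORT}",
        'MAN: "ssdp:discover"',
        "MX: 1",
        f"ST: {SEARCH_TARGET}",
        "", "",
    ]).encode()

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(request, (SSDP_ADDR, SSDP_PORT))

    responses = []
    try:
        while True:
            data, addr = sock.recvfrom(4096)
            responses.append((addr[0], data.decode(errors="replace")))
    except socket.timeout:
        pass
    finally:
        sock.close()
    # Each response carries a LOCATION header with the device description URL.
    return responses

if __name__ == "__main__":
    for ip, reply in discover():
        print(ip, reply.splitlines()[0])
```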
DMS 230 interfaces to home network 220 and provides storage and playback capabilities for media content. DMS 230 acts as a DLNA “server” device. DMS 230 may include a user interface as well as the capability to be controlled by an external device, such as M-DMC 250. DMS 230 may also include video and audio conversion processors and functionality. DMS 230 may include a plurality of interface protocols for receiving and delivering media content to other networked devices (e.g., DMR 240 and M-DMC 250). These interface protocols include, but are not limited to DLNA, HDMI, SPDIF, and the like. DMS 230 may be a settop box, a personal computer, a network-attached storage (NAS) device, or the like.
DMR 240 interfaces to DMS 230 and provides video and audio display and reproduction capability. DMR 240 acts as a “renderer” device. DMR 240 may also include a user interface as well as the capability to be controlled by an external device, such as M-DMC 250. DMR 240 may be a television, audio/video receiver, video display, remote speakers for music, or the like.
M-DMC 250 interfaces to both DMS 230 and DMR 240. M-DMC 250 acts as a “controller” device. Controller devices, such as M-DMC 250, discover and identify content on digital media servers (e.g., DMS 230) through a command and control interface as part of DLNA. M-DMC 250 may also direct the operation of one or more of the devices in the network. In some cases, M-DMC 250 may also provide content to the DMR 240, in addition to providing control for both DMS 230 and DMR 240. M-DMC 250 may be an Internet-ready tablet device, Wi-Fi enabled digital camera, mobile cellular phone, advanced remote control device, personal digital assistant, or the like.
It is important to note that one or all of the elements described in home network system 200 may be present in the same room or may be present in separate rooms at a user's premises. For example, gateway device 210 and DMS 230 may be located in a utility room, while DMR 240 and M-DMC 250 may be located in a family room. Also, more than one of the elements described in home network system 200 may be present in the user's premises and connected through home network 220 and/or operating using the DLNA protocol. For example, a second DMR may be located in a bedroom and operated using M-DMC 250 with content provided through DMS 230.
Further, it is important to note that the functions described for gateway device 210 and DMS 230 may be merged into a single component or transferred and/or shared between components. For example, DMS 230 may include all functionality that is included in a gateway device, such as gateway device 210, and used as the interface to the WAN. Other devices, not shown, (e.g., computers, tablets, routers and the like), that are also connected to home network 220 would interface and communicate with DMS 230 in order to interface and communicate to the WAN.
Turning now to FIG. 3, a block diagram of an embodiment of a network device 300 is shown. Network device 300 may operate in a manner similar to one or more of the client devices 108A-108N or gateway device 120 described in FIG. 1.
Signals are interfaced to network device 300 at input signal receiver 302. Input signal receiver 302 connects to input stream processor 304. The input stream processor 304 connects to audio processor 306 and video processor 310. Audio processor 306 connects to audio interface 308, which provides the audio output signal from network device 300. Video processor 310 connects to display interface 318 which provides the video output signal from network device 300. Audio processor 306 and video processor 310 also connect to a storage device 312. A controller 314 connects to the storage device 312, as well as input stream processor 304, audio processor 306, and video processor 310. A control memory 320 connects to the controller 314. Controller 314 also connects to user interface 316 and handheld interface 322.
Media content (e.g., audio/video programs, program guide information, and program data), as part of a signal interfaced with the WAN, is received in an input signal receiver 302. The input signal receiver 302 may be one or more of several known receiver circuits used for receiving, demodulating, and decoding signals provided over one of the several possible networks, including over-the-air, cable, satellite, Ethernet, fiber, and phone line networks. It is important to note that input signal receiver 302 may include receiving, demodulation, and decoding circuitry for data signals as well as media content signals delivered over either the same delivery network as the desired broadcast input signal (i.e., delivery network 1 (106)) or over a different network (i.e., delivery network 2 (112)) and/or an alternative cellular or wireless network, as described in FIG. 1.
The decoded output signal from one or more of the circuits in input signal receiver 302 is provided to an input stream processor 304. The input stream processor 304 performs the final signal selection and processing, and includes separation of video content from audio content for the content stream. The audio content is provided to an audio processor 306 for conversion from the received format (e.g., compressed digital signal) to another format (e.g., analog waveform signal). The analog waveform signal is provided to an audio interface 308 and further to a display device, such as display devices 114A-114N described in FIG. 1.
The video output from the input stream processor 304 is provided to a video processor 310. The video signal may be one of several formats. The video processor 310 provides, as necessary, a conversion of the video content based on the input signal format. The video processor 310 also performs any necessary conversion for the storage of the video signals.
A storage device 312 stores audio and video content received at the input. The storage device 312 allows later retrieval and playback of the content under the control of a controller 314 and also based on commands, e.g., navigation instructions such as fast-forward (FF) and rewind (Rew), received from a user interface 316. The storage device 312 may be a hard disk drive, one or more large capacity integrated electronic memories, such as static RAM (SRAM), or dynamic RAM (DRAM), an interchangeable optical disk storage system such as a compact disk drive or digital video disk drive, or storage external to, and accessible by, network device 300.
The converted video signal, from the video processor 310, either originating from the input stream processor 304 or from the storage device 312, is provided to the display interface 318. The display interface 318 further provides the display signal to a display device of the type described above. The display interface 318 may be an analog signal interface, such as red-green-blue (RGB), or may be a digital interface (e.g., HDMI).
The controller 314 is interconnected via a bus to several of the components of the network device 300, including the input stream processor 304, audio processor 306, video processor 310, storage device 312, user interface 316, and handheld interface 322. The controller 314 manages the conversion process for converting the input stream signal into a signal for storage on the storage device or for display. The controller 314 also manages the retrieval and playback of stored content.
The controller 314 is further coupled to control memory 320 (e.g., volatile or non-volatile memory, including RAM, SRAM, DRAM, ROM, programmable ROM (PROM), flash memory, erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), etc.) for storing information and instruction code for controller 314. Control memory 320 may store instructions for controller 314. Control memory 320 may also store a database of elements, such as graphic elements representing still images for the displayed or recorded content. The database may be stored as a pattern of graphic elements. Alternatively, the control memory 320 may store the graphic elements in identified or grouped memory locations and use an access or location table to identify the memory locations for the various portions of information related to the graphic elements. Additional details related to the generation, storage, and display of the graphic elements will be described below. Further, the implementation of the control memory 320 may include several possible embodiments, such as a single memory device or, alternatively, more than one memory circuit communicatively connected or coupled together to form a shared or common memory. Still further, the control memory 320 may be included with other circuitry, such as portions of bus communications circuitry, in a larger circuit.
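As a rough illustration of the access or location table described above, the following sketch stores graphic element data in a common buffer and keeps a table of offsets and lengths; the class and method names are hypothetical and not part of network device 300.

```python
# Illustrative location table for graphic elements stored in a shared buffer:
# each entry maps an element identifier to the (offset, length) of its data.
class GraphicElementStore:
    def __init__(self):
        self._buffer = bytearray()
        self._table = {}  # element_id -> (offset, length)

    def put(self, element_id, data: bytes):
        # Record where the element starts and how long it is, then append it.
        self._table[element_id] = (len(self._buffer), len(data))
        self._buffer.extend(data)

    def get(self, element_id) -> bytes:
        offset, length = self._table[element_id]
        return bytes(self._buffer[offset:offset + length])
```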
In addition to interfacing to a user interface element and a display device, network device 300 may also interface to a handheld device, such as a tablet, through handheld interface 322. This handheld device may include a display screen with additional controls or may include a touch screen. Video signals from video processor 310 as well as other data, such as the on screen display messages and message prompt returns, may be routed between controller 314 and handheld interface 322. Handheld interface 322 may transmit and receive signals and data with a handheld device or tablet using a radio frequency communications link, such as Wi-Fi, Bluetooth, or the IEEE standard 802.11. Handheld interface 322 may alternatively transmit and receive signals and data with a handheld device or tablet using an infra-red interface.
Network device 300 also includes the ability to generate and store visual image thumbnails representing still frame images at points in time in the received media content (e.g., video stream). Video processor 310, under control of controller 314, may process the received content to extract a still video frame. In an alternate embodiment, the received media content may be recorded into the storage device and the stored content retrieved and processed by video processor 310 to extract the still video frame. The extraction may occur at periodic intervals. The periodic intervals may be programmable by the user or may be set by design. In one embodiment, a time interval of ten minutes in the media content is used. The still video frames are further time indexed to the video content and stored along with the video content. The extraction of the still video frame may use any number of well known image capture techniques including, but not limited to, recovery of a compressed and encoded I-frame, decoding of an I-frame, or capturing and storing a field or frame of uncompressed video content directly into memory.
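By way of example only, the following sketch captures time-indexed still frames at a fixed interval from a recorded program. It assumes the buffered content is available as a local file and that the ffmpeg command-line tool is installed; neither assumption comes from the disclosure, and the function and directory names are illustrative.

```python
# Sketch: extract time-indexed thumbnail images from a recorded program at a
# fixed interval (e.g., every ten minutes).
import subprocess
from pathlib import Path

def extract_thumbnails(recording: str, duration_s: int, interval_s: int = 600,
                       out_dir: str = "thumbs") -> list[tuple[int, str]]:
    Path(out_dir).mkdir(exist_ok=True)
    index = []  # (time offset in seconds, thumbnail path)
    for t in range(0, duration_s, interval_s):
        out_path = f"{out_dir}/thumb_{t:06d}.jpg"
        # Seek to offset t and grab a single frame as a JPEG thumbnail.
        subprocess.run(
            ["ffmpeg", "-y", "-ss", str(t), "-i", recording,
             "-frames:v", "1", "-q:v", "3", out_path],
            check=True, capture_output=True)
        index.append((t, out_path))
    return index

# Example: extract_thumbnails("buffered_program.ts", duration_s=3600)
# yields thumbnails indexed at 0, 600, 1200, ... seconds into the program.
```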
Network device 300 further includes the ability to receive and process program information (e.g., program guide data) including metadata and images associated with programs (e.g., upcoming program content to be delivered in the future). Some or all of the program information may be delivered as part of the data stream along with audio and video programs from a content service provider, such as broadcast affiliate manager 104 described in FIG. 1.
In operation, network device 300 receives and processes media content for playback and display. Network device 300 further implements a process for managing the inputs for current and future programming available for display in order to generate a visual timeline based content navigation tool. Network device 300 provides a media program display structure that includes a portion of the display dedicated to the visual timeline feature. Thumbnail images in the timeline identify fast forward and rewind navigation capability and represent time points in the program as it is displayed. One or more of the thumbnail images also may be used to identify future programs. Network device 300 stores the received content, making at least the previously delivered content available as part of the visual timeline navigation feature.
In one embodiment, visual timeline navigation allows a quick rewind feature. Thumbnails of video frames for the currently viewed program will be generated and stored in the background in network device 300. Through a user control device, such as a remote control, a user may activate the thumbnail browsing, which lists the thumbnails for the past time of the viewed program. The user can navigate to the desired thumbnail and select the thumbnail. The thumbnail image may be selected by a button on the control device or through another user input. The video content will start playing from that previous point in time in the program.
In another embodiment, visual timeline navigation allows a quick fast forward feature while viewing past content. During the display and viewing of past video content accessed using the quick rewind feature described above, network device 300 continues generating and storing thumbnail images for the current program as it is received. A user may navigate to a thumbnail image that is forward in time relative to the currently displayed and viewed program. The thumbnail image may be selected by a button on the control device or through another user input. The video content will start playing from that future point in time relative to the currently displayed point in the program.
In yet another embodiment, visual timeline navigation allows identification of future programs, implementing a fast forward view feature for the current program. One or more thumbnail images (e.g., program advertisement, movie or show poster, trailer) may be retrieved or generated for a future program (e.g., the next one or more programs available on the same received channel as the currently viewed program). Alternatively, text information may be retrieved. The thumbnail image or text information is included in the visual timeline.
In a further embodiment, visual timeline navigation may allow viewing of a portion of the video content for the program in the thumbnail window. When the user is browsing the thumbnails and pauses on a particular thumbnail for a few seconds, a video clip (with or without audio) starting at that thumbnail image point in the program will begin playing in that thumbnail space.
In addition, the time indexing for the program and/or thumbnail images may use an absolute value that may be referenced to the current time. The time indexing may also use a time relative to the beginning or ending of the current program. For example, the thumbnail images may be indexed as T+/−x minutes, where T is the beginning or end of a program and x is the index time interval. Program indexing may also be limited or expanded to more or less than the time of one program (e.g., indexing of the last one hour of programming regardless of whether the current program is a half hour or two hour program).
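A small illustration of the relative (T+x) versus absolute indexing described above; the function name and label formats are illustrative.

```python
from datetime import datetime, timedelta

def timeline_label(program_start: datetime, offset_s: int, relative: bool = True) -> str:
    """Label a thumbnail either relative to the program start (T+x) or in absolute time."""
    if relative:
        minutes = offset_s // 60
        return f"T+{minutes} min"
    return (program_start + timedelta(seconds=offset_s)).strftime("%H:%M")

# e.g. timeline_label(datetime(2013, 5, 29, 20, 0), 1200)                  -> "T+20 min"
#      timeline_label(datetime(2013, 5, 29, 20, 0), 1200, relative=False)  -> "20:20"
```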
The processes described in the present disclosure may employ an input device that can be used to express functions for navigating or browsing content on a device. To allow for this, a touch panel device 400, shown in FIG. 4, may be interfaced to a device, such as network device 300, through the user interface 316 and/or the handheld interface 322 described in FIG. 3.
In one embodiment, the touch panel device 400 may serve as a navigational tool to navigate the display of an electronic program guide or content display guide. In other embodiments, the touch panel device 400 may additionally serve as the display device allowing the user to more directly interact with the navigation through the grid guide showing display of content. It is important to note that the touch panel device 400 may be integrated into the settop box itself as part of, for instance, a front panel display or array. The touch panel device 400 may also be included as part of a remote control device containing more conventional control functions, such as activator or actuator buttons.
The functioning and control for generating, managing, and providing a visual timeline associated with current and future programming content using a network connected device may be encompassed as part of the operating code or firmware associated with the device (e.g., client devices 108A-108N in FIG. 1 or network device 300 in FIG. 3).
Turning to FIG. 5, a block diagram of an embodiment of a software architecture 500 for providing the visual timeline feature in a device, such as network device 300 described in FIG. 3, is shown. Architecture 500 includes a program database component 545, which contains a program database 540 accessed through a source plugin API 542, along with a broadcast event source plugin 520, a broadband event source plugin 525, a recording manager 560, and a content aggregator 570.
The program database component 545 provides several services and functions for managing a content database within a device (e.g., network device 300). The program database component 545 provides a persistent storage of event information (e.g., database entries and information) that is retrievable after reboot of the device. The program database component 545 further provides efficient and flexible search interface functionality on event or entry information and criteria. The search functionality may include searching for specific information related to programs (e.g., genre, time slot). The search functionality may also include searching on combinations of criteria. The program database component 545 also provides a flexible interface for providing event or entry information to the program database 540.
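As a rough sketch of the criteria-based search interface described above, the following helper treats each criterion as a field/value pair and accepts arbitrary combinations of criteria; the field names are illustrative, not taken from the disclosure.

```python
# Sketch of a criteria-based event search: each criterion is a field/value pair
# and an event matches only if every criterion is satisfied (combinable filters).
def search_events(events, **criteria):
    def matches(event):
        return all(event.get(field) == value for field, value in criteria.items())
    return [e for e in events if matches(e)]

# Example: combine criteria such as genre and time slot.
# search_events(guide_events, genre="drama", time_slot="20:00")
```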
The program database 540 is responsible for collecting and providing a searchable interface for event or entry information. The data provided for events (e.g., media content entries) may vary significantly between different service providers and networks. The program database 540 has been designed with a very flexible search input interface and database structure that allows the operating code to accommodate significant variations in event or entry data without specific knowledge of the details of the data being stored. In addition, the inputs to program database 540 are abstracted through source plugin API 542. The program database specifies an interface through the source plugin API 542 for the initial event processing. The source plugin API 542 does not include functions or operations that are exposed to other components (e.g., recording manager 560 and content aggregator 570).
Broadcast event source plugin 520 and broadband event source plugin 525, for interfacing event sources, are two of several possible shared libraries that can be called directly by program database 540. The broadcast event source plugin 520 provides specific interfacing to the program guide data that is carried within a broadcast signal stream adhering to one of the broadcast standards (e.g., DVB, ATSC). The broadband event source plugin 525 provides specific interfacing to one or more web or Internet based content delivery services. The separate source plugin API 542 is further defined to allow event sources to register with the program database 540 and provide event information. By using a plugin model, customizations may be made for a particular service or content provider in the plugin without affecting the rest of program database component 545. The source plugin API 542 also includes interfaces for other source plugins (not shown) to further permit the program database 540 to accommodate event or entry data from multiple sources.
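The plugin model might be sketched as follows, with event sources registering through a common interface so the database needs no knowledge of source-specific parsing. The class names here are illustrative stand-ins, not the actual source plugin API 542.

```python
# Illustrative plugin model: each event source implements a common interface
# and registers itself with the program database, which stays agnostic of the
# source-specific parsing (broadcast tables, web services, etc.).
from abc import ABC, abstractmethod

class EventSourcePlugin(ABC):
    @abstractmethod
    def fetch_events(self) -> list[dict]:
        """Return normalized event records (title, channel, start, duration, ...)."""

class ProgramDatabase:
    def __init__(self):
        self._plugins = []
        self._events = []

    def register(self, plugin: EventSourcePlugin):
        self._plugins.append(plugin)

    def refresh(self):
        # Pull normalized events from every registered source.
        for plugin in self._plugins:
            self._events.extend(plugin.fetch_events())

class BroadcastEventSource(EventSourcePlugin):
    def fetch_events(self):
        # Parsing of guide tables from the broadcast stream would go here (omitted).
        return [{"title": "Evening News", "channel": 7,
                 "start": "20:00", "duration_min": 30}]
```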
Program database 540 also provides a set of service definitions that allow components (e.g., recording manager 560 and content aggregator 570) to search and retrieve events or entries and accompanying information. Recording manager 560 uses information from program database 540 to schedule and manage recording of content received by a device (e.g., network device 300). These recordings may include automatic recording of currently viewed program content. The recordings may also be based on user preferences or predefined conditions (e.g., scheduled or series recordings) or may be initiated by a user request. Recording manager 560 may also control video still image extraction from the recorded content using one or more well known techniques as described earlier. Recording manager 560 also receives event (e.g., program guide) updates from program database 540 to adjust the recording schedule when schedule changes occur. Recording manager 560 may maintain the various video still image and program guide components as part of a database and assemble the visual timeline associated with the program content for display based on a user request.
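A minimal sketch of the schedule-adjustment behavior described for recording manager 560, assuming events are keyed by an identifier and that updated start times and durations arrive from the program database; all names are illustrative.

```python
# Sketch: keep a recording schedule keyed by event id and update the start
# time and duration whenever the program database reports a schedule change.
class RecordingManager:
    def __init__(self):
        self._schedule = {}  # event_id -> {"start": ..., "duration_min": ...}

    def schedule(self, event_id, start, duration_min):
        self._schedule[event_id] = {"start": start, "duration_min": duration_min}

    def on_guide_update(self, event_id, new_start, new_duration_min):
        entry = self._schedule.get(event_id)
        if entry and (entry["start"] != new_start or
                      entry["duration_min"] != new_duration_min):
            # Adjust the pending recording to the updated guide data.
            entry["start"] = new_start
            entry["duration_min"] = new_duration_min
```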
Content aggregator 570 gathers and further identifies content that either exists, or is available, from multiple sources interfaced to the device (e.g., network device 300). These sources include any event or program data provided by program database 540. Other sources may include previously recorded content residing on the device, downloaded content residing on the device, and user content residing on a USB stick or the home network. Content aggregator 570 provides a single interface for providing information about all available content. Content aggregator 570 may further interface to other modules or services within the software architecture, including, but not limited to, a digital living network alliance (DLNA) service, a user interface for a local display device, and a remote client running on a tablet.
Architecture 500 may be used in a device (e.g., network device 300 described in FIG. 3) to implement the visual timeline features of the present disclosure, including the process described below.
The process described below allows a user to display a visual content timeline for current and future program content and further allows user navigation, such as fast forward, rewind, and preview features, using the visual content timeline. The process describes a program display structure that includes a portion of the display dedicated to thumbnail images identifying fast forward and rewind navigation points in a visual timeline. The thumbnail images represent time points in the program as it was received and displayed, and one or more of the thumbnail images may also be used to identify future programs planned or available. The content is stored or buffered, making at least the previously delivered content available for a rewind feature and as part of the visual timeline navigation feature. The navigation tool makes the content navigable or browsable using the series of thumbnail images (e.g., still video frames from the content) and allows fast forward and rewind features as well as additional video display features. The process may be used in various arrangements and program architectures, including architecture 500 described in FIG. 5.
Turning now to FIG. 6, a flow chart of an embodiment of a process 600 for generating and navigating a visual timeline for media content is shown. Process 600 will be described primarily with respect to network device 300 described in FIG. 3.
At step 610, a program that is currently being requested, received, and provided for display is recorded. The recording, at step 610, may be performed by directing an output for display from a video processor, such as video processor 310, to a local storage device, such as storage device 312, or to a separate storage device. The recording may alternatively be performed on an external device, such as gateway device 120 described in FIG. 1.
Next, at step 620, the recorded content is processed to generate still images from the recorded content. The generating, at step 620, may include extracting a still video frame from the media content stream during the recording. The generating, at step 620, may also include extracting a still video frame from the media content after recording by retrieving the recorded content and processing the content in a video processor (e.g., video processor 310). The extraction of still video frames may use any number of well known image capture techniques including, but not limited to, recovery of a compressed and encoded I-frame, decoding of an I-frame, or capturing and storing a field or frame of uncompressed video content directly into memory.
The generation, at step 620, may occur at periodic intervals. These periodic intervals may be programmable by the user or may be set by design. In one embodiment, a time interval of ten minutes in the media content is used. These still video frames are further time indexed to the video content.
The generation, at step 620, may also include storing the extracted still images as visual elements or thumbnails to include in a visual timeline. The visual elements may be stored along with the recorded content. Alternatively, the visual elements may be stored separately as a database or pattern of elements in the same memory as the recorded content (e.g., storage device 312) or in a separate memory (e.g., control memory 320).
At step 630, program guide data that is received in the device is processed to identify program information to include in the visual timeline. For example, program information may be identified, at step 630, for the program or programs available at the conclusion of the currently viewed and displayed program. The program information may be limited to programs on the same channel as the currently viewed and displayed program. In some embodiments, the processing at step 630 may include identifying video information and visual images based on the file type or based on some other characteristic of the information. A search may be initiated to acquire additional visual data for the programming that is part of the program guide information. In other embodiments, the processing at step 630 is limited to identifying and extracting text information for the program or programs.
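As an illustration of the selection performed at step 630, the sketch below filters guide entries to the program or programs on the same channel that begin at or after the end of the current program; the entry fields are assumptions, not taken from the disclosure.

```python
# Sketch: from guide entries, pick the program(s) on the same channel that
# start at or after the end of the currently viewed program (step 630).
from datetime import datetime

def upcoming_on_channel(guide, channel, current_end: datetime, count: int = 1):
    candidates = [e for e in guide
                  if e["channel"] == channel and e["start"] >= current_end]
    return sorted(candidates, key=lambda e: e["start"])[:count]

# Example guide entry: {"channel": 7, "title": "Late Movie",
#                       "start": datetime(2013, 5, 29, 21, 0), "poster": "movie.jpg"}
```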
At step 640, a visual timeline is generated for display. The visual timeline is assembled by retrieving the visual images or thumbnails generated at step 620 and assembling the thumbnails into a graphic visual display. The thumbnails are assembled into an order based on a time index. Time indexing for the program and/or thumbnail images may use an absolute value that may be referenced to the current time. The time indexing may also use a time relative to the beginning or ending of the current program. For example, the thumbnail images may be indexed as T+/−x minutes, where T is the beginning or end of a program and x is the index time interval. Program indexing may also be limited or expanded to more or less than the time of one program (e.g., indexing of the last one hour of programming regardless of whether the current program is a half hour or two hour program).
Also, at step 640, the generation of the visual timeline may include retrieving information (e.g., visual images and thumbnails or text) associated with one or more programs available at the conclusion of the currently viewed and displayed program. The information may be included and indexed to appear at the end (e.g., the far right) of the visual timeline. Further, the visual images, thumbnails, or text information may be in a graphic visual display as separate small windows. The graphic visual display may also include a window for the currently viewed and displayed program. The currently viewed and displayed program may be scaled from the normal full size on the display to a size that fits in the window in the graphic visual display.
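Step 640 might be sketched as follows, ordering the extracted thumbnails by time index and appending entries for the upcoming program or programs at the far-right end of the timeline; the data layout is illustrative.

```python
# Sketch of step 640: order the extracted thumbnails by their time index and
# append entries for the upcoming program(s) at the far-right end of the timeline.
def build_timeline(thumbnails, upcoming):
    """thumbnails: list of (offset_seconds, image_path); upcoming: list of guide entries."""
    timeline = [{"kind": "past", "offset_s": t, "image": path}
                for t, path in sorted(thumbnails)]
    for event in upcoming:
        timeline.append({"kind": "future", "title": event["title"],
                         "image": event.get("poster"), "start": event["start"]})
    return timeline
```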
At step 650, the visual timeline generated at step 640 is provided to a display device for display. The visual timeline may be provided to the display as a separate display element or may be provided as part of a graphic visual display including the currently viewed program. The timeline may be generated, at step 640, or provided, at step 650, as a result of a user request or user input selecting the visual timeline navigation feature. The timeline may also be generated, at step 640, or provided, at step 650, automatically when a program is viewed.
At step 660, a user input requesting a navigation feature using the visual timeline is received. The user input may be received through a user interface, such as user interface 316 or handheld interface 322, from a user control device, such as a remote control or handheld tablet. In response to the user input at step 660, at step 670, a navigation response is performed using the visual timeline. Several possible user inputs and navigation responses are possible and are described further below.
In one embodiment, a user requests, at step 660, a rewind feature. A user may activate thumbnail browsing, which lists the thumbnails for the elapsed portion of the viewed program. The user navigates or scrolls to the desired thumbnail for a previous point in time and selects the thumbnail. The thumbnail image may be selected by a button on the control device or through another user input. The video content will start playing, at step 670, from that previous point in time in the program in response to the user input.
In another embodiment, a user requests, at step 660, a fast forward from previous time feature. With the visual timeline displayed, the user navigates or scrolls to the desired thumbnail that is at a point ahead in time relative to the currently displayed program time. The video content will start playing, at step 670, from that future point in time relative to the currently displayed point in the program.
In yet another embodiment, a user requests, at step 660, a fast forward identification of future program feature. With the visual timeline displayed, the user navigates or scrolls to and displays, at step 670, the one or more thumbnail images (e.g., program advertisement, movie or show poster, trailer) for a future program (e.g., the next one or more programs available on the same received channel as the currently viewed program). Alternatively, the text information may be retrieved and displayed.
In a further embodiment, a user requests, at step 660, a review or preview feature. With the visual timeline displayed, the user navigates or scrolls to the desired thumbnail image. The user hovers over the thumbnail image for a predetermined time period (e.g., two seconds) or separately selects the thumbnail image with a different user control. In response, at step 670, a video clip for the content starting at that point begins playing in the location of the thumbnail image. The thumbnail window may also enlarge in size to improve viewing of the video clip by the user. The video clip may stop in response to another user input, such as the user moving the cursor away from the time slot.
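The navigation responses of steps 660 and 670 described in the preceding embodiments can be summarized in a small dispatcher sketch. The Player interface (seek, play_clip, show_info) is assumed for illustration; the disclosure does not prescribe a particular playback API.

```python
# Hedged sketch of the navigation responses of step 670 for the user inputs of
# step 660, operating on TimelineItem records from the earlier sketch.
# The Player methods used here are assumptions, not a disclosed API.
def handle_navigation(player, item, action: str) -> None:
    """Dispatch a user input on a timeline item to a navigation response."""
    if item.kind == "past" and action == "select":
        # Rewind or fast-forward browse: resume playback at the selected time point.
        player.seek(item.time_offset_s)
    elif item.kind == "past" and action == "hover":
        # Review/preview: play a short clip in (an enlarged) thumbnail window.
        player.play_clip(start_s=item.time_offset_s, window=item)
    elif item.kind == "future" and action == "select":
        # Future-program identification: show poster, trailer, or text information.
        player.show_info(item)
```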
It is important to note that the steps in process 600 describe only a preferred embodiment. In other embodiments, certain steps in process 600 may be rearranged or may be eliminated. For example, in one embodiment, the generation of still video images at step 620 may occur after acquiring and processing the program guide information at step 630. Further, although the steps in process 600 are shown as being performed serially, it is possible that one or more of the steps may be performed in parallel. For instance, the generation of still video images at step 620 and the acquisition and processing of program guide information at step 630 may be performed in parallel, with the results from both steps leading directly to the subsequent timeline generation at step 640, as described above. Further, the generation of still video images at step 620 may occur while recording the content at step 610.
Turning to the accompanying figure, an exemplary content display window 700 is shown.
The display of the content display window 700 may be initiated by a user input (e.g., pressing a button on a remote control). Content display window 700 includes a currently viewed content window 710. Several smaller thumbnail display windows, 720, 730, 740, 750, and 760, are shown below window 710. Each of these thumbnail windows 720, 730, 740, 750, and 760 represents a point in a visual timeline related to the currently viewed content shown in window 710. The first window 720 is shown only partially in view. In some embodiments, the first and last windows may be shown only partially in view, while in other embodiments all windows may be shown fully in view. The time points for each of the thumbnail display windows 720, 730, 740, 750, and 760 may be indexed based on an absolute time (e.g., the current time) or may be indexed relative to the beginning or ending of the program.
In one embodiment, the currently viewed program is one hour long, the current time is 8:49 PM, and the time increment between images is ten minutes. As a result, window 720 shows a thumbnail image for 8:00, window 730 shows a thumbnail image for 8:10, window 740 shows a thumbnail image for 8:20, window 750 shows a thumbnail image for 8:30, and window 760 shows a thumbnail image for 8:40.
In another embodiment, the currently viewed program has an elapsed time of 49 minutes from its beginning and the time increment is 5 minutes. As a result, window 720 shows a thumbnail image for 25 minutes elapsed time, window 730 shows a thumbnail image for 30 minutes elapsed time, window 740 shows a thumbnail image for 35 minutes elapsed time, window 750 shows a thumbnail image for 40 minutes elapsed time, and window 760 shows a thumbnail image for the program at 45 minutes elapsed time.
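The time points in the two embodiments above follow from flooring the current position to the chosen increment and stepping back across the five windows, as the following illustrative helper shows; the function name is invented for this example.

```python
# Worked example of the time points shown in the two embodiments above: given
# the current position (in minutes) and the increment, the five windows show
# the five most recent multiples of the increment. Illustrative only.
def thumbnail_times(current_min: float, increment_min: float, windows: int = 5) -> list[float]:
    latest = (current_min // increment_min) * increment_min   # floor to the increment
    return [latest - increment_min * i for i in range(windows - 1, -1, -1)]

print(thumbnail_times(49, 10))  # [0.0, 10.0, 20.0, 30.0, 40.0] -> 8:00 ... 8:40
print(thumbnail_times(49, 5))   # [25.0, 30.0, 35.0, 40.0, 45.0] minutes elapsed
```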
Although the thumbnail images are described as being generated and displayed at periodic time intervals, other generation and/or display intervals are possible. Thumbnail images may be generated at specific time intervals while the display of the thumbnail images is weighted in a manner that favors one time period over another. For example, the thumbnail images may be generated at 5 minute time intervals. The display of the thumbnails may show the two most recently generated thumbnail images, then the thumbnail image for a time point 20 minutes previous, and then the thumbnail image for a point at the start of the program. Other arrangements for the generation and display of the thumbnail images are also possible.
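One possible way to realize such a weighted display, reusing the illustrative Thumbnail record from the earlier extraction sketch, is shown below; the particular selection rule is only an example of the weighting described above.

```python
# Sketch of a weighted display selection: from thumbnails generated at regular
# intervals, keep the two most recent, one from roughly 20 minutes earlier,
# and one from the start of the program. The rule itself is one example only.
def weighted_selection(thumbnails):
    """Pick a weighted subset of time-ordered thumbnails for display."""
    ordered = sorted(thumbnails, key=lambda t: t.time_offset_s)
    if len(ordered) <= 4:
        return ordered
    newest = ordered[-2:]                                   # two most recent images
    target = ordered[-1].time_offset_s - 20 * 60            # ~20 minutes earlier
    earlier = min(ordered[:-2], key=lambda t: abs(t.time_offset_s - target))
    start = ordered[0]                                      # start of the program
    picked = {id(start), id(earlier)} | {id(t) for t in newest}
    return [t for t in ordered if id(t) in picked]
```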
It is important to note that although content display window 700 includes thumbnail display windows and thumbnail images for previous content, one or more of the thumbnail images may represent programs available at the end of the current program. For example, thumbnail display window 760 may include a thumbnail image or text information for the program that is available at the end of the currently viewed program on the same broadcast channel.
Turning to the next figure, an exemplary content display window 800 is shown.
In content display window 800, window 810 shows the currently viewed and displayed program content from a channel received from a service provider (e.g., broadcast affiliate manager 104 described earlier).
Content display window 700 and content display window 800 may also include text or other graphic symbols for the currently viewed content window or for each of the thumbnail display windows. In one embodiment, a text box indicating the time (e.g., absolute time or relative time) associated with each thumbnail image may be included below the thumbnail display windows. Graphic arrows may also be included between the thumbnail images and at each end to permit navigation between the images and scrolling to each end. In other embodiments, text indicating time may be located above the thumbnail display windows or be overlaid on the thumbnail image in an opaque or transparent text box.
It is important to note that the location and/or size of the currently viewed content windows and thumbnail display windows used in content display window 700 and in content display window 800 are exemplary; other locations, sizes, and arrangements of these windows are possible.
The present disclosure describes a program display structure that includes a portion of the display dedicated to thumbnails identifying fast forward and rewind navigation points in a visual timeline for a currently viewed program. The thumbnails represent time points in the portion of the program that has already been received and/or viewed. The thumbnails may also identify future programs that are planned or available. A receiving device buffers or stores the received and/or viewed content and makes this content available for navigation features. The navigation feature makes the content navigable or browsable using a series of thumbnail images (e.g., still video frames from the content). A rewind browse feature is available by navigating one or more thumbnail images for content that was already displayed. Further, a fast forward browse feature is available by navigating one or more thumbnails (e.g., a poster or still video frame) for a future program at a point in time at the end of the currently viewed program. Timeline display and navigation may be performed in terms of time relative to the current program or in terms of absolute time. Further, each thumbnail may represent its own video clip, which may be played when the user navigates to or pauses over the thumbnail.
Although embodiments which incorporate the teachings of the present disclosure have been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings. Having described preferred embodiments of a method and apparatus for navigating through media content (which are intended to be illustrative and not limiting), it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular embodiments disclosed which are within the scope of the disclosure as outlined by the appended claims.