A digital signage content management system is provided that uses existing interfaces, such as web interfaces, and turns existing commercially available graphics programs, such as web-based tools or locally run programs like Microsoft PowerPoint®, into a digital signage platform to facilitate developing and managing digital signage applications through the creation of smart objects and intelligent templates that are easy to create and easy to modify to suit different applications. This enables digital signage content to be professionally created without requiring custom programming for each and every stream of new and/or changing content. The smart objects and intelligent templates can also be used to provide content with changing elements in real-time.
1. A method for generating objects for a digital display comprising:
enabling an object layout to be defined for presenting data within said object using an interface provided by an existing graphics program;
enabling one or more data sources to be identified for obtaining said data;
enabling behavior logic to be created for responding to changing inputs from said one or more data sources, said behavior logic being executable to self-configure a respective object to dynamically modify its output in response to at least one data trigger, without user intervention, said behavior logic interrelating said object with at least one other object to have an event related to said object trigger a change in said at least one other object;
enabling said object to be stored in a library of objects with said at least one other object to enable said objects to be available to multiple applications; and
enabling said object to be used with said at least one other object in digital display output by:
enabling said object to be added, with at least one other object from said library of objects, to a reusable template that inherits said data sources and said behavior logic to individually control said object in conjunction with said at least one other object within said template; and
executing said behavior logic for said object, during use of the template in providing the digital display output.
6. A non-transitory computer readable medium comprising computer executable instructions for generating objects for a digital display, comprising instructions for:
enabling an object layout to be defined for presenting data within said object using an interface provided by an existing available graphics program;
enabling one or more data sources to be identified for obtaining said data;
enabling behavior logic to be created for responding to changing inputs from said one or more data sources, said behavior logic being executable to self-configure a respective object to dynamically modify its output in response to at least one data trigger, without user intervention, said behavior logic interrelating said object with at least one other object to have an event related to said object trigger a change in said at least one other object;
enabling said object to be stored in a library of objects with said at least one other object to enable said objects to be available to multiple applications; and
enabling said object to be used with said at least one other object in digital display output by:
enabling said object to be added, with at least one other object from said library of objects, to a reusable template that inherits said data sources and said behavior logic to individually control said object in conjunction with said at least one other object within said template; and
executing said behavior logic for said object, during use of the template in providing the digital display output.
2. The method of
3. The method of
4. The method of
5. The method of
7. The non-transitory computer readable medium of
8. The non-transitory computer readable medium of
9. The non-transitory computer readable medium of
10. The non-transitory computer readable medium of
This application is a continuation of U.S. patent application Ser. No. 12/101,396 filed on Apr. 11, 2008 which claims priority from U.S. provisional application No. 60/911,572 filed on Apr. 13, 2007, both incorporated herein by reference.
The present invention relates to digital signage applications and has particular utility in automating such digital signage applications.
The market for public information displays has evolved considerably over the past years to include full motion video content combined with images and text displayed on high-resolution video graphics screens. It is becoming more and more common to see these kinds of digital signage systems in shopping centers, hotels, university campuses, and corporate lobbies. For the most part, the process of creating and managing content to be displayed on these screens has involved using standard graphics and video production tools to produce pre-rendered video clips that are then played back according to a predefined schedule or play list.
More recently, tools have become available that allow individual content elements such as graphics, animations, and video to be dynamically composited and rendered into a video stream in real-time, without the need to pre-render all content into a single video file. This allows independent elements or layers of content to be changed “on the fly” in response to specific data conditions. In a typical example, a weather display could automatically show the latest temperature and meteorological conditions throughout the day while the latest news headlines scroll in a ticker at the bottom of the screen. As the weather changes or breaking news becomes available, the content on the screen is automatically updated.
Whereas conventional digital signage systems have used DVD players or other simple video playback systems to display pre-rendered content, real-time digital signage systems utilize more advanced computer and video hardware and specialized software to dynamically render content elements on demand. These real-time systems have the major advantage of being able to instantly update screen content in response to manual or automatic triggers, unlike the conventional video playback systems which require an entire video clip to be re-rendered every time content needs to be updated. This greatly reduces production times and network bandwidth required to distribute content for playback in multiple locations, resulting in content that is more dynamic and visually appealing to the audience.
A major drawback of real-time digital signage tools is that they require a greater amount of development effort to create video and graphics content and integrate this content with real-time data sources. This translates into higher operating costs, making the return on investment for this type of digital signage system less attractive for many applications, despite the clear benefits to the audience in terms of more interesting and engaging content.
The process of creating real-time data-driven graphics for digital signage typically requires 4 primary steps: 1) Creation of graphical and video elements by a graphic artist; 2) Development of custom software applications or scripts by a software programmer to link graphical elements to data sources; 3) Distribution of graphical elements and software applications to final play-out locations using either a local or wide-area network, or a manual distribution medium such as CD-ROM; and 4) Monitoring and updating of system elements on an ongoing basis.
The currently available tools and systems provide a means for achieving each of these steps, but for the most part require a level of specialized knowledge that the average user must acquire through extensive training and hands-on experimentation.
Embodiments of the invention will now be described by way of example only with reference to the appended drawings wherein:
The following provides a digital signage content management system that uses existing interfaces, such as web interfaces, and turns existing commercially available graphics programs, such as web-based tools or locally run programs like Microsoft PowerPoint®, into a digital signage platform to facilitate developing and managing digital signage applications. This enables digital signage content to be professionally created without requiring custom programming for each and every stream of new and/or changing content.
Referring now to
The management server 10 operates with various other elements and, in many applications, various other entities to arrange and distribute the content to the players 22 and kiosks 24 or any other entity that utilizes the content. It will be appreciated that each element shown in
The intelligent templates 34, which are explained in greater detail below, are created by an authoring entity or program 11, which utilizes a template generator 12 and an existing and commercially available, or otherwise convenient or familiar, graphics program 13. The template generator 12 allows the user to create templates 34 using the familiar interface and functions provided by the graphics program 13. One example of a particularly suitable graphics program is Microsoft PowerPoint®. As can be seen, a web control interface 16 (e.g. a computer connected to the program 11 over the Internet) can be used to provide a web-based development environment. It will be appreciated that the template generator 12 and graphics program 13 may also be run locally or be PC based. The template generator 12 allows users to work directly inside the graphics program 13, such as PowerPoint®, to develop high-end digital signage productions. Using smart objects 26 as shown in
The template generator 12 utilizes smart objects 26 to build the intelligent templates 34. In this way, certain properties and parameters defined for a smart object 26 can be inherited by the intelligent templates 34 such that by modifying an object 26, a template 34 can be modified. This allows standard objects 26 and templates 34 to be created that can change for each and every instance and use of the object 26 and template 34 for different applications. The smart objects 26 can be stored in an object library 15.
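The patent does not specify how this inheritance is implemented; the following is a minimal Python sketch, under assumed class and attribute names, of how a template 34 could inherit the data sources and behaviour logic of the smart objects 26 it contains, so that modifying an object also modifies every template that uses it.

```python
# Minimal sketch (not the patented implementation) of a smart object 26 and an
# intelligent template 34. All names here are illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class SmartObject:
    """A self-contained object 26: layout, data sources, and behaviour logic."""
    name: str
    layout: Dict[str, object]                      # position, size, graphical elements
    data_sources: List[str] = field(default_factory=list)
    behaviour: Callable[[dict, "SmartObject"], None] = lambda data, obj: None

    def on_data(self, data: dict) -> None:
        # Self-configure in response to a data trigger, without user intervention.
        self.behaviour(data, self)


@dataclass
class Template:
    """An intelligent template 34 built from smart objects 26."""
    name: str
    objects: List[SmartObject] = field(default_factory=list)

    @property
    def data_sources(self) -> List[str]:
        # The template inherits the data sources of the objects it contains.
        return [src for obj in self.objects for src in obj.data_sources]

    def dispatch(self, data: dict) -> None:
        # Execute each object's behaviour logic while the template is in use.
        for obj in self.objects:
            obj.on_data(data)
```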
As can be seen in
The template generator 12 should enable the user to drag and drop templates 34; add unlimited layers of objects 26 anywhere on the screen, including multiple video windows, crawling data tickers, 3D animations, etc.; and select from a library of professional, customizable templates and reusable objects. The user can create templates 34 once and then automatically update video clips, images and dynamic text elements based on actions and business rules. The user can also link slide elements with information from databases, websites, RSS feeds or any other data source. Custom control or behaviour logic 32, 40 can also be defined using a scripting engine (not shown).
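The patent mentions a scripting engine but does not define its language or API; the sketch below assumes a simple Python rule of the kind such an engine could attach to a slide element, linking it to an RSS feed. The feed URL and the slide_element parameter are hypothetical.

```python
# Hypothetical behaviour rule linking a slide element to a live data source.
# The RSS URL and the slide_element object are assumptions for illustration.

import xml.etree.ElementTree as ET
from urllib.request import urlopen


def latest_headline(feed_url: str) -> str:
    """Return the title of the first item in an RSS feed."""
    with urlopen(feed_url) as resp:
        root = ET.parse(resp).getroot()
    item = root.find("./channel/item/title")
    return item.text if item is not None else ""


def update_ticker(slide_element, feed_url: str = "https://example.com/news.rss") -> None:
    # Business rule: refresh the ticker text whenever the rule is evaluated.
    slide_element.text = latest_headline(feed_url)
```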
The content manager 14 allows the user to manage all aspects of play-out control from a single interface. This includes building play lists, updating template 34 information, defining play-out schedules, and scheduling content delivery to any player 22 or group of players on the network 19. Updating templates 34 can be automated using predefined rules, making the updating of even complex play list content much easier and more consistent, thus allowing non-skilled users to create custom signage. Graphics can be scheduled to play out at specific intervals or times of day; expiry dates can automatically delete slides after a specified time; templates 34, play lists and other assets can be dragged onto individual players 22 for automatic distribution; and shut-down times for remote displays can be scheduled to preserve power and overall screen life. A web control interface 16 may be provided to enable the user to manage content with web pages. This allows non-technical users to interact with the system to update specific areas without requiring knowledge of or access to the authoring tools such as the template generator 12. It will be noted that several separate web control interfaces 16 may be used (e.g. one for interacting with each element in
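The patent describes the content manager's scheduling capabilities but not a storage format; a play-out schedule of roughly the following shape would capture them. All field names and values below are illustrative assumptions.

```python
# Illustrative play-out schedule of the kind the content manager 14 could hold.

from datetime import datetime, time

playlist = {
    "name": "lobby_loop",
    "players": ["lobby-01", "lobby-02"],           # players 22 receiving this content
    "items": [
        {"template": "welcome_screen", "start": time(8, 0), "end": time(18, 0)},
        {"template": "weekly_specials", "interval_s": 300},                 # replay every 5 minutes
        {"template": "event_promo", "expires": datetime(2008, 12, 31)},     # auto-delete after expiry
    ],
    "display_shutdown": {"off": time(22, 0), "on": time(7, 0)},  # preserve power and screen life
}
```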
The players 22 and kiosks 24 are typically broadcast rendering engines that deliver high quality output. The players 22 dynamically composite slide elements in real-time to generate the final video output. In this way, individual content elements do not need to be re-rendered whenever a change is required. The players 22 can use the same rendering engine used by TV networks and other broadcasters. The players 22 can automatically download and store all content locally, allowing uninterrupted play-out in the event of a network disruption. Inputs from cameras, cable/satellite feeds and DVD players should be supported, as well as any resolution output from analog NTSC/PAL to DVI to HD-SDI. The players 22 can also use an auto-recovery feature to allow the player 22 to self-diagnose and solve system errors without user intervention.
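One way a player 22 could realize the local caching behaviour described above is sketched below; this is an assumption for illustration, not the patented mechanism, and the cache directory and download helper are invented names.

```python
# Sketch of local content caching so play-out continues through a network disruption.

import os
import shutil
from urllib.request import urlopen

CACHE_DIR = "player_cache"


def fetch(url: str, filename: str) -> str:
    """Return a local path for the asset, refreshing it only when the network is reachable."""
    os.makedirs(CACHE_DIR, exist_ok=True)
    local_path = os.path.join(CACHE_DIR, filename)
    try:
        with urlopen(url, timeout=5) as resp, open(local_path, "wb") as out:
            shutil.copyfileobj(resp, out)
    except OSError:
        pass  # network disruption: fall back to the previously cached copy, if any
    return local_path
```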
A remote network manager 20 may be used to manage the content and monitor network status across multiple locations. The entire network 19 can be monitored and, should a problem be detected, corrective action can be taken before it is noticed by the users. From the remote manager 20, content can be dragged and dropped from a local network to one or more remote locations, networks of players 22 can be browsed and viewed live from any location, email alerts can be received when a problem occurs, detailed technical statistics can be viewed for any player 22, system and run logs can be viewed for any player 22, disk space and usage can be managed for all players 22 automatically, and control commands can be sent to one or more locations to automate tasks such as turning screens on or off.
The system shown in
The system introduces a concept of reusable “smart” components that include a plurality of graphics or video elements, a data layer, and a behaviour layer. These self-contained components can be used to generate a portion of a display, such as a weather or stock ticker, or an entire full-screen video output comprising multiple elements, each with its own set of data sources and individual behaviours.
The use of smart components greatly reduces the need for specialized training on the part of the end user. Whereas in prior systems a user required a certain minimum level of competency as a graphic artist or software developer, the introduction of smart components allows users without any specialized knowledge to quickly and easily create complete video graphics digital signage applications that combine real-time information sources with dynamic display characteristics.
As shown in
Smart objects 26 in this example may include the following basic characteristics: 1) An object 26 can contain an unlimited number of graphical elements, including text, images, animations, and video; 2) Multiple objects 26 can be used simultaneously to form a composited rich media final output; 3) Each object 26 is entirely self-contained, including all of the graphical and video elements, data sources, and business rules needed to generate a final output; and 4) Objects 26 can be self-configuring, allowing the output to be dynamically modified in response to data triggers, without the need for user intervention. An example of this is a weather graphic that automatically displays a cloud animation when it is cloudy or a sun animation when it is sunny, or a financial graphic that shows a red downward-pointing arrow when the stock market is down or a green upward-pointing arrow when the market is up.
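The self-configuring behaviour in these examples amounts to selecting an asset from a data trigger; a minimal sketch follows, in which the asset file names and condition strings are invented purely for illustration.

```python
# Sketch of self-configuring behaviour: the graphic chooses its own animation
# or icon from a data trigger, with no user intervention.

def weather_animation(conditions: str) -> str:
    """Choose the animation asset for a weather smart object."""
    return {
        "sunny": "sun.swf",
        "cloudy": "clouds.swf",
        "rain": "rain.swf",
    }.get(conditions.lower(), "generic_weather.swf")


def market_arrow(change: float) -> str:
    """Choose the arrow asset for a financial smart object."""
    return "arrow_up_green.png" if change >= 0 else "arrow_down_red.png"
```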
The smart objects 26 are particularly powerful for the end user, since they encompass not only an object's graphical elements 28, but also the rules or behaviour logic 32 which define how the graphical elements will respond to continuously changing inputs from the data sources 30.
A typical example of using smart objects 26 involves retail displays installed in a department store. A display could be configured to show a continuous loop of video, images, and promotional text associated with the specials of that week. A smart object 26 within the display layout can be designed to integrate with the department store's inventory management system. If the inventory level for any of the items displayed on the screen falls below a minimum threshold, the object automatically switches to an alternative set of specials on items for which inventory is available. Without smart objects 26, this example would require custom software development for each screen layout that is required. With smart objects 26, the rules are defined once, and then reused again and again for any number of screen layouts. Also, the behaviour logic 32 can be used to interrelate multiple objects 26 such that an event relevant to one object 26 triggers a change in another object 26. Using the above example, in a retail environment, a change in the weather, e.g. when it begins to rain, can trigger a change in advertising for merchandise, e.g. rain wear or umbrellas.
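One way to express this interrelation of objects is a simple publish/subscribe pattern; this is an assumption for illustration, not the mechanism the patent prescribes, and the event names and thresholds are invented.

```python
# Minimal publish/subscribe sketch of behaviour logic 32 interrelating objects,
# so that an event on one object 26 triggers a change in another.

from collections import defaultdict
from typing import Callable, DefaultDict, List

subscribers: DefaultDict[str, List[Callable[[dict], None]]] = defaultdict(list)


def subscribe(event: str, handler: Callable[[dict], None]) -> None:
    subscribers[event].append(handler)


def publish(event: str, payload: dict) -> None:
    for handler in subscribers[event]:
        handler(payload)


# Inventory rule: switch to alternative specials when stock runs low.
subscribe("inventory_update",
          lambda p: print("switch specials") if p["qty"] < p.get("min", 10) else None)

# Weather rule: a change in the weather retargets the advertising object.
subscribe("weather_update",
          lambda p: print("show rain wear and umbrella ads") if p["conditions"] == "rain" else None)

publish("inventory_update", {"item": "patio set", "qty": 3})
publish("weather_update", {"conditions": "rain"})
```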
To use an object 26 to create a portion of a final video output, a user can simply drag and drop the component from a browser window (organized in the template generator 12) onto the workspace, or “canvas” (provided by the graphics program 13). The component's graphical elements and layout, as well as any internal logic and business rules, are automatically added to the canvas. By dragging and dropping multiple objects 26 onto the canvas, a user can create a complete finished layout in a matter of seconds. This results in much more than a simple graphics layout. When the layout is displayed on a video screen, each of the components automatically configures itself and automatically displays live graphics and video information based on its internal logic 32 and behaviour definitions.
Typical examples of smart objects include: 1) Weather objects showing real-time weather conditions; 2) Sports tickers showing live sports results; 3) Headline tickers that continuously scroll live news information; 4) Video windows that automatically play through a loop of video content; and 5) Alert pop-ups that automatically appear in the event of a fire alarm or weather warning.
Intelligent templates 34 typically include the following characteristics: 1) Layout information defining where each individual object is located on the final output display; 2) Dynamic parameters that can be changed by the user without requiring a re-edit of the template, which can be as simple as a video filename that can be set by the user for a full screen video template, or as complex as a drop list of branding options, each of which completely redefines the entire template layout with a single click; 3) Rules defining how individual objects interact with each other; 4) Scheduling information, defining where and when each template should be displayed; 5) Expiry dates for content, allowing templates to be displayed only within a specified validity period; and 6) Business rules dictating how a template should be reconfigured based on dynamic data inputs, e.g., a single template which, when displayed in a certain location, displays video content applicable to that audience demographic, but when displayed in a different location, displays entirely different video content applicable to a different audience demographic.
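The patent lists these six characteristics without defining a concrete schema; an illustrative template definition covering them might look as follows, with all keys and values being assumptions.

```python
# Illustrative structure for an intelligent template 34, keyed to the six
# characteristics listed above.

template = {
    "layout": {"weather": {"x": 0, "y": 0}, "ticker": {"x": 0, "y": 680}},   # 1) object placement
    "parameters": {"video_file": "promo.mp4", "branding": "default"},        # 2) user-set dynamic parameters
    "object_rules": ["weather.rain -> ads.show('umbrellas')"],               # 3) inter-object rules
    "schedule": {"players": ["lobby-01"], "daily": ["08:00-18:00"]},         # 4) where and when to display
    "expires": "2008-12-31",                                                 # 5) validity period
    "business_rules": {"location:downtown": "playlist_urban",                # 6) data-driven reconfiguration
                       "location:suburban": "playlist_family"},
}
```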
Templates 34 should include everything necessary to generate a complete digital signage output, including graphical elements, video components, multiple data inputs, animations, business rules, and scheduling information 38.
Using the combination of smart objects 26 and intelligent templates 34, users can build libraries of hundreds or thousands of reusable components, which can be stored in the object library 15 and the template library 17. These libraries 15, 17 can be shared between users in the same physical location or in multiple geographic locations, e.g. through a web control interface 16. For many applications, generic default or otherwise existing objects 26 and templates 34 can be used “as is” without modification. For other applications, users can select an existing object 26 or template 34, modify the parameters of that object 26 or template 34, and save it as a new component.
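The reuse workflow described above can be sketched as fetching an existing component from the library 15, overriding a few parameters, and saving it back under a new name; the library representation and function names below are assumptions, not the patent's API.

```python
# Sketch of deriving a new component from an existing library entry.

import copy


def derive(library: dict, source_name: str, new_name: str, **overrides) -> dict:
    """Copy an existing library component and save it under a new name."""
    component = copy.deepcopy(library[source_name])
    component.update(overrides)
    library[new_name] = component
    return component


object_library = {"weather_standard": {"city": "Montreal", "units": "metric"}}
derive(object_library, "weather_standard", "weather_nyc", city="New York", units="imperial")
```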
A typical scenario would involve multiple levels of users: 1) Advanced users would create objects by combining graphics, data, and programming elements; 2) Intermediate users would build new digital signage projects by drawing from existing library elements; and 3) Less advanced users would manage the scheduling and custom information displayed by the objects and templates on the final display output.
Turning now to
Turning now to
Turning now to
It can therefore be seen that the above-described system can be used to provide a digital signage content management system that uses existing interfaces, such as web interfaces, and turns existing commercially available graphics programs, such as web-based tools or locally run programs like Microsoft PowerPoint®, into a digital signage platform to facilitate developing and managing digital signage applications. This enables digital signage content to be professionally created without requiring custom programming for each and every stream of new and/or changing content.
Although the invention has been described with reference to certain specific embodiments, various modifications thereof will be apparent to those skilled in the art.