There are provided systems and methods for performing metadata extraction and management. Such a system includes a computing platform having a hardware processor, a system memory, and a metadata extraction and management unit stored in the system memory. The system is configured to extract multiple metadata types from a media asset, and to aggregate the multiple metadata types to produce an aggregated metadata describing the media asset. The system is further configured to transform the aggregated metadata into at least one database entry identifying the media asset, and to map the at least one database entry into a graphical database so as to relate the media asset to at least one other media asset represented in the graphical database.
7. A method for use by a system including a computing platform including a hardware processor and a system memory having stored therein a metadata extraction and management unit, the method comprising:
extracting, using the hardware processor, a plurality of metadata types from a media asset;
aggregating, using the hardware processor, the plurality of metadata types to produce an aggregated metadata describing the media asset;
using the aggregated metadata to include at least one database entry in a graphical database, wherein the at least one database entry describes the media asset;
displaying a user interface for a user to view the media asset, the user interface further displaying a timeline metadata, and a plurality of horizontal bars each corresponding to a different one of tags associated with the media asset and each extending along the timeline metadata to indicate presence of the corresponding tag in the media asset; and
correcting the presence of one of the tags in the media asset, in response to the user extending or reducing the horizontal bar corresponding to the one of the tags via the user interface.
1. A system comprising:
a computing platform having a hardware processor and a system memory;
a metadata extraction and management unit stored in the system memory;
wherein the hardware processor is configured to execute the metadata extraction and management unit to:
extract a plurality of metadata types from a media asset;
aggregate the plurality of metadata types to produce an aggregated metadata describing the media asset;
use the aggregated metadata to include at least one database entry in a graphical database, wherein the at least one database entry describes the media asset;
display a user interface for a user to view the media asset, the user interface further displaying a timeline metadata, and a plurality of horizontal bars each corresponding to a different one of tags associated with the media asset and each extending along the timeline metadata to indicate presence of the corresponding tag in the media asset; and
correct the presence of one of the tags in the media asset, in response to the user extending or reducing the horizontal bar corresponding to the one of the tags via the user interface.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
8. The method of
9. The method of
10. The method of
11. The method of
12. The method of
Media assets, for example, movies or television (TV) programming, are typically rich in features and may require a variety of different categories of metadata to adequately describe their content. However, the conventional generation of metadata descriptive of a particular media asset is inadequate for enabling an effective comparison of content features across a library of such media assets. As a result, the conventional approach to generating and storing metadata descriptive of media assets fails to address large scale searching and cross-referencing of those media assets.
There are provided metadata extraction and management systems and methods, substantially as shown in and/or described in connection with at least one of the figures, and as set forth more completely in the claims.
The following description contains specific information pertaining to implementations in the present disclosure. One skilled in the art will recognize that the present disclosure may be implemented in a manner different from that specifically discussed herein. The drawings in the present application and their accompanying detailed description are directed to merely exemplary implementations. Unless noted otherwise, like or corresponding elements among the figures may be indicated by like or corresponding reference numerals. Moreover, the drawings and illustrations in the present application are generally not to scale, and are not intended to correspond to actual relative dimensions.
The present application describes systems and methods for extracting and managing various types of metadata corresponding to a media asset. According to implementations of the present inventive concepts, multiple types of metadata may be extracted from a media asset. Those multiple types of metadata may then be aggregated to produce an aggregated metadata describing the media asset. The aggregated metadata may be transformed into one or more database entries describing the media asset. The one or more database entries may, in turn, be mapped into a graphical database so as to relate the media asset from which the multiple types of metadata were extracted to one or more other media assets represented in the graphical database. Moreover, in some implementations, a portion of the media asset, such as a temporal block of the media asset, may be related to one or more analogous portions of other media assets represented in the graphical database. Consequently, the metadata extraction and management solution disclosed in the present application can increase the efficiency and effectiveness with which large scale searching and cross-referencing of media assets can be performed.
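The extract, aggregate, transform, and map stages described above can be sketched as follows. This is a minimal illustrative sketch only; all names (MediaAsset, extract_metadata, and so on) and the stubbed metadata values are hypothetical and do not appear in the present disclosure.

```python
# Hypothetical sketch of the four-stage pipeline: extract metadata types,
# aggregate them, transform into database entries, map into a graph.
from dataclasses import dataclass

@dataclass
class MediaAsset:
    asset_id: str
    title: str

def extract_metadata(asset):
    # Stubbed extraction of two example metadata types.
    return {
        "shots": [(0.0, 4.2), (4.2, 9.8)],
        "faces": [{"name": "ACTOR_A", "start": 1.0, "end": 3.5}],
    }

def aggregate(metadata_by_type):
    # Aggregation merges the per-type results into a single record.
    return {"types": sorted(metadata_by_type), "data": metadata_by_type}

def to_database_entries(asset, aggregated):
    # Transform the aggregated metadata into graph-style entries.
    return [{"node": asset.asset_id, "properties": aggregated["data"]}]

def map_into_graph(graph, entries):
    # Map each entry into the (here dict-based) graphical database.
    for entry in entries:
        graph[entry["node"]] = entry["properties"]
    return graph

asset = MediaAsset("m150", "Example Feature")
graph = {}
entries = to_database_entries(asset, aggregate(extract_metadata(asset)))
map_into_graph(graph, entries)
```

Once assets are mapped in this way, relating one asset (or a temporal block of it) to analogous assets reduces to queries over the graph's nodes and edges.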
Referring to
It is noted that although
According to the implementation shown by
Continuing to
Metadata extraction and management unit 220 also includes aggregation module 224, and graphical database 226 receiving one or more database entries 228 from aggregation module 224. It is noted that aggregation module 224, graphical database 226, and one or more database entries 228 are described in greater detail below. Also shown in
Referring to
Media assets 150 and 152 may be media content in the form of a feature film or TV programming, for example. Moreover, metadata extractor 222 of metadata extraction and management unit 120/220 may include multiple metadata extraction modules corresponding respectively to the metadata types to be extracted from media asset 150/152. For example, metadata extractor 222 may include first through Nth metadata extraction modules 262 through 266 each specifically configured to extract metadata distinct from that extracted by any other metadata extraction module.
As a specific example, where media asset 150/152 is a feature film or TV programming content, first metadata extraction module 262 may be configured to perform shot detection in order to extract metadata describing the boundaries of substantially every shot in media asset 150/152. Second metadata extraction module 264 through Nth metadata extraction module 266 may each be configured to extract other, different types of metadata from media asset 150/152. Furthermore, first metadata extraction module 262 through Nth metadata extraction module 266 may each be configured to extract metadata from media asset 150/152 automatically, without the intervention or participation of system user 142. In addition to metadata extracted from media asset 150/152 as a result of shot detection, metadata extraction modules 262 through 266 may be configured to extract metadata through scene detection, facial recognition, speech detection, object detection, and music or soundtrack recognition, to name a few exemplary operations.
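The first through Nth metadata extraction modules can be pictured as objects sharing a common interface, each declaring the single metadata type it produces. The sketch below is illustrative only; the class names and stubbed detector outputs are assumptions, not the disclosed implementation.

```python
# Hypothetical sketch: N extraction modules with one shared interface,
# each configured to extract a metadata type distinct from the others.
class ShotDetector:
    metadata_type = "shots"
    def extract(self, asset):
        # Real shot detection would analyze frames; stubbed here as
        # (start_seconds, end_seconds) boundaries per detected shot.
        return [(0.0, 4.2), (4.2, 9.8)]

class SpeechDetector:
    metadata_type = "speech"
    def extract(self, asset):
        return [{"start": 1.0, "end": 2.5, "text": "..."}]

def run_extractors(asset, extractors):
    # Collect each module's output keyed by its metadata type.
    return {e.metadata_type: e.extract(asset) for e in extractors}

result = run_extractors("m150", [ShotDetector(), SpeechDetector()])
```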
In some implementations, hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to extract the multiple metadata types from media asset 150/152 using metadata extraction modules 262 through 266 operating substantially in parallel. However, in some implementations, some or all of the metadata extraction may be performed sequentially, and may be prioritized based on metadata type. For example, in some implementations, it may be advantageous or desirable to utilize Nth metadata extraction module 266 to extract an Nth metadata type prior to utilizing first metadata extraction module 262 to extract a first metadata type. Furthermore, when sequential extraction of metadata types is preferred, selection of which of metadata extraction modules 262 through 266 is used to extract metadata from media asset 150/152 next, as well as determination of extraction parameters for the selected metadata extraction module, may be based on an earlier extracted metadata of another type.
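The two scheduling modes described above, parallel extraction and prioritized sequential extraction in which a later module may consult earlier results, can be sketched as follows. The priorities, class names, and derivation of scenes from shots are illustrative assumptions.

```python
# Hypothetical sketch of parallel vs. prioritized sequential extraction.
from concurrent.futures import ThreadPoolExecutor

class ShotDetector:
    metadata_type, priority = "shots", 0
    def extract(self, asset, prior=None):
        return [(0.0, 4.2), (4.2, 9.8)]

class SceneDetector:
    metadata_type, priority = "scenes", 1
    def extract(self, asset, prior=None):
        # An earlier-extracted metadata type (shots) can inform the
        # extraction parameters of a later module, as the text describes.
        shots = (prior or {}).get("shots", [])
        return [(shots[0][0], shots[-1][1])] if shots else []

def extract_parallel(asset, extractors):
    # Modules operate substantially in parallel via a thread pool.
    with ThreadPoolExecutor() as pool:
        futures = {e.metadata_type: pool.submit(e.extract, asset)
                   for e in extractors}
        return {t: f.result() for t, f in futures.items()}

def extract_sequential(asset, extractors):
    # Modules run in priority order; each sees earlier results.
    results = {}
    for e in sorted(extractors, key=lambda e: e.priority):
        results[e.metadata_type] = e.extract(asset, prior=results)
    return results

seq = extract_sequential("m150", [SceneDetector(), ShotDetector()])
```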
It is noted that although the present method is described in terms of the extraction and management of metadata corresponding to one media asset 150/152, in some implementations, hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to extract and manage metadata for multiple media assets substantially concurrently. For example, instantiation 262a of first metadata extraction module 262 may be utilized to extract a first metadata type from media asset 150 while instantiation 262b is utilized to extract the first metadata type from media asset 152 substantially concurrently. Similarly, multiple instantiations of metadata extraction modules 264 through 266 may be utilized substantially concurrently to extract their respective metadata types from media assets 150 and 152.
Flowchart 370 continues with aggregating the multiple metadata types to produce an aggregated metadata describing media asset 150/152 (action 374). Hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to produce an aggregated metadata describing media asset 150/152, using aggregation module 224. As shown in
In some implementations, aggregating the multiple metadata types may include using metadata included in one metadata type to validate a metadata included in another metadata type. For example, in some implementations, it may be possible to check the accuracy of metadata included in a second metadata type using a metadata included in a fourth metadata type. In those implementations, hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to utilize aggregation module 224 to validate some or all of the metadata included in the second metadata type using metadata included in the fourth metadata type.
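One way such cross-type validation might look, sketched under the assumption that face-track metadata is checked against shot-boundary metadata (the specific pairing of types is an illustrative choice, not mandated by the disclosure):

```python
# Hypothetical sketch: validate one metadata type against another by
# discarding face-track intervals that fall outside every detected shot.
def validate_against_shots(face_tracks, shots):
    def inside_some_shot(track):
        return any(start <= track["start"] and track["end"] <= end
                   for start, end in shots)
    return [t for t in face_tracks if inside_some_shot(t)]

shots = [(0.0, 4.2), (4.2, 9.8)]
tracks = [{"name": "A", "start": 1.0, "end": 3.5},
          {"name": "B", "start": 9.0, "end": 12.0}]  # past the last shot
valid = validate_against_shots(tracks, shots)
```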
Flowchart 370 continues with transforming the aggregated metadata into one or more database entries 228 identifying media asset 150/152 (action 376). Hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to transform the aggregated metadata into one or more database entries 228 identifying media asset 150/152, using aggregation module 224. For example, the aggregated metadata describing media asset 150/152 may be transformed into one or more database entries 228 in the form of a graphical representation or representations identifying and describing media asset 150/152.
Flowchart 370 can conclude with mapping one or more database entries 228 into graphical database 226 so as to relate media asset 150/152 to at least one other media asset represented in graphical database 226 (action 378). Hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to map one or more database entries 228 into graphical database 226, using aggregation module 224. It is noted that relating media asset 150/152 to at least one other media asset represented in graphical database 226 may include relating merely a portion of media asset 150/152, such as a temporal block of media asset 150/152, to one or more analogous portions of other media assets represented in graphical database 226.
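Relating a temporal block of one asset to analogous blocks of another can be pictured as adding edges between block-level nodes in the graphical database. The dict-based graph and the "similar_theme" label below are illustrative assumptions.

```python
# Hypothetical sketch of a minimal graphical database: asset nodes carry
# temporal blocks, and edges relate a block of one asset to a block of
# another asset.
def add_asset(graph, asset_id, blocks):
    graph.setdefault("nodes", {})[asset_id] = {"blocks": blocks}
    graph.setdefault("edges", [])

def relate_blocks(graph, asset_a, block_a, asset_b, block_b, label):
    # Relate merely a portion of one asset to a portion of another.
    graph["edges"].append(
        {"from": (asset_a, block_a), "to": (asset_b, block_b), "label": label})

graph = {}
add_asset(graph, "m150", ["act1", "act2"])
add_asset(graph, "m152", ["act1", "act2"])
relate_blocks(graph, "m150", "act1", "m152", "act2", "similar_theme")
```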
In some implementations, metadata extraction and management unit 120/220 may be configured to relate one or more database entries 228 corresponding to media asset 150/152 to at least one other media asset represented in graphical database 226 inferentially. Such inferential identification of a relationship between one or more database entries 228 and at least one other media asset represented in graphical database 226 may be absolute or probabilistic.
For example, in implementations in which metadata extraction and management unit 120/220 performs only absolute inferencing, one or more database entries 228 are related to other media assets only where the relationship can be established with substantial certainty. However, in some implementations, it may be advantageous or desirable to enable probabilistic inferencing by metadata extraction and management unit 120/220. In those latter implementations, metadata extraction and management unit 120/220 may be configured to inferentially relate one or more database entries 228 to another media asset or assets when the certainty of the relationship exceeds a confidence threshold of less than one hundred percent.
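The distinction between absolute and probabilistic inferencing reduces to the confidence threshold applied to candidate relationships, which can be sketched as follows; the candidate scores are fabricated for illustration.

```python
# Hypothetical sketch: absolute inferencing admits only relationships
# established with certainty (threshold 1.0), while probabilistic
# inferencing admits those exceeding a threshold below one hundred percent.
def infer_relations(candidates, threshold):
    return [c for c in candidates if c["confidence"] >= threshold]

candidates = [
    {"pair": ("m150", "m152"), "confidence": 1.0},   # certain
    {"pair": ("m150", "m153"), "confidence": 0.82},  # probable only
]
absolute = infer_relations(candidates, threshold=1.0)
probabilistic = infer_relations(candidates, threshold=0.8)
```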
It is noted that, in some implementations, the method outlined in flowchart 370 may be fully automated, and may not require the participation of system user 142. In other words, system 110 may extract and manage metadata corresponding to media asset 150/152 automatically, simply as a result of ingestion of media asset 150/152 into system 110.
However, in some implementations, the systems and methods disclosed in the present application may include use of user interface 130/230, by system 110, to display the metadata types extracted as a result of action 372, and/or the aggregated metadata produced as a result of action 374, and/or one or more database entries 228, to system user 142. That is to say, hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to display the metadata types extracted as a result of action 372, and/or the aggregated metadata produced as a result of action 374, and/or one or more database entries 228, to system user 142, via user interface 130/230.
Referring to
As shown in
Thus, status bars 482 of media asset thumbnail representations 450 and 458 indicate that metadata extraction is substantially complete. By contrast, status bars 482 of media asset thumbnail representation 452 indicate that metadata extraction is in progress and at an intermediate stage of completion, while status bars 482 of media asset thumbnail representation 456 indicate that metadata extraction has not yet begun. In addition, media asset thumbnail representation 460 indicates that an error has prevented metadata extraction from occurring. Moreover, selection of media asset thumbnail representation 450 by system user 142, through use of cursor 480 for example, results in enlargement of status bars 482 to identify the specific types of metadata corresponding respectively to status bars 482.
In cases where system user 142 is an authorized knowledge base manager of system 110, hardware processor 114 may be configured to execute metadata extraction and management unit 120/220 to receive a data input from system user 142, via user interface 130/230. Such a system user provided data input may be in the form of a descriptive data further describing a media asset, and/or a corrective data for correcting one or more database entries 228. In addition, in those instances, hardware processor 114 may be further configured to execute metadata extraction and management unit 120/220 to modify one or more database entries 228 based on the data input received from system user 142 via user interface 130/230.
Referring now to
According to the implementation shown in
System user 142 may correct the media asset entries in tags field 586, or may input descriptive data or corrective data using comments field 588. For example, system user 142 can correct the beginning/end of tags listed in tags field 586 by extending/reducing the colored horizontal bars representative of theme similarity metadata 594. In addition, in some implementations, system user 142 may create new colored horizontal bars for inclusion in theme similarity metadata 594 to manually specify new tags.
It is noted that changes introduced to tags field 586 by system user 142 may be required to conform to a controlled vocabulary for describing media asset 552, in order to facilitate later aggregation and/or search, for example. That is to say, a data input received from system user 142 via user interface 130/230 may conform to a controlled vocabulary for describing media asset 552. However, in some implementations, system user 142 may be free to make entries in comments field 588 free of the constraints imposed by such a controlled vocabulary. It is further noted that timeline metadata 590 advantageously enables system user 142 to correct or otherwise modify metadata included in one or more database entries 228 at timestamp level.
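The two editing rules above, tag edits constrained to a controlled vocabulary while tag intervals are adjustable at timestamp level, can be sketched as follows. The vocabulary terms and function names are illustrative assumptions only.

```python
# Hypothetical sketch: tag edits must use a controlled vocabulary, and a
# tag's interval (its horizontal bar) can be extended or reduced at
# timestamp level.
CONTROLLED_VOCABULARY = {"chase", "romance", "battle"}

def set_tag_interval(tags, name, start, end):
    if name not in CONTROLLED_VOCABULARY:
        raise ValueError(f"tag {name!r} not in controlled vocabulary")
    tags[name] = (start, end)  # extend or reduce the tag's bar
    return tags

tags = {"chase": (10.0, 25.0)}
set_tag_interval(tags, "chase", 10.0, 30.5)  # user extends the bar
```

A free-text comments field, by contrast, would accept input without consulting the vocabulary at all.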
Thus, the present application describes systems and methods for performing metadata extraction and management. According to implementations of the present inventive concepts, multiple metadata types are extracted from a media asset, are aggregated, and are transformed into one or more database entries describing the media asset. The one or more database entries, in turn, are mapped into a graphical database so as to relate the media asset to at least one other media asset represented in the graphical database. Those one or more database entries can then be searched, accessed, and modified. Consequently, the metadata extraction and management solution disclosed in the present application can increase the efficiency and effectiveness with which large scale searching and indexing of media assets is performed.
From the above description it is manifest that various techniques can be used for implementing the concepts described in the present application without departing from the scope of those concepts. Moreover, while the concepts have been described with specific reference to certain implementations, a person of ordinary skill in the art would recognize that changes can be made in form and detail without departing from the scope of those concepts. As such, the described implementations are to be considered in all respects as illustrative and not restrictive. It should also be understood that the present application is not limited to the particular implementations described herein, but many rearrangements, modifications, and substitutions are possible without departing from the scope of the present disclosure.
Smolic, Aljoscha, Sigal, Leonid, Junyent Martin, Marc, Accardo, Anthony M., Narayan, Nimesh, Pont-Tuset, Jordi, Beltran, Pablo, Farre Guiu, Miguel Angel