Embodiments of the present invention comprise methods and systems for detecting and locating skipped frames in a test video sequence in relation to a reference video sequence. Some embodiments comprise identifying pairs of temporally aligned reference and test segments. An alignment offset and a freeze-frame count associated with each segment pair are received and used to calculate a number of skipped frames between a first segment pair and a second segment pair, the second segment pair being temporally subsequent to the first segment pair. In some embodiments, a number of skipped frames between the first segment pair and the second segment pair is determined by calculating a segment offset between the two segment pairs and subtracting the segment offset value from the sum of the number of freeze frames associated with the first segment pair and all temporally intervening segment pairs between the first segment pair and the second segment pair.
|
1. A method for detecting skipped frames between a test video sequence and a reference video sequence, said method comprising:
a) identifying a first plurality of segment pairs associated with a test video sequence comprising a plurality of test frames and a reference video sequence comprising a plurality of reference frames, wherein each segment pair in the first plurality of segment pairs comprises a portion of the reference frames and a temporally aligned portion of the test frames;
b) for each segment pair in the first plurality of segment pairs, receiving an associated alignment offset and an associated freeze-frame count;
c) calculating a segment alignment offset between a first segment pair in the first plurality of segment pairs and a second segment pair in the first plurality of segment pairs, wherein the second segment pair is temporally subsequent to the first segment pair;
d) for the first segment pair and each segment pair in a second plurality of segment pairs comprising any segment pair in the first plurality of segment pairs temporally intervening between the first segment pair and the second segment pair, summing the associated freeze-frame counts; and
e) subtracting the segment alignment offset from the summed freeze-frame count to form a skipped-frame count associated with the first segment pair and the second segment pair.
2. The method as described in
a) for the first segment pair, calculating a similarity measure between corresponding frames in the first-segment-pair reference frames and the first-segment-pair temporally aligned test frames, thereby producing a first plurality of similarity measures;
b) for each segment pair in the second plurality of segment pairs, calculating a similarity measure between corresponding frames in the segment-pair reference frames and the segment-pair temporally aligned test frames, thereby producing a second plurality of similarity measures;
c) identifying discontinuities in said first plurality of similarity measures and said second plurality of similarity measures; and
d) resolving said identified discontinuities using the skipped-frame count.
3. The method as described in
4. The method as described in
5. The method as described in
a) distilling each test frame in the plurality of test frames and each reference frame in the plurality of reference frames into frame distillation measurements;
b) from the frame distillation measurements, performing a linear alignment measurement using a linear Hough transform of a local Pearson's cross-correlation coefficient (LPCCC) image for spatial alignment of the frames between the test and reference video sequences to find a best fit line through the LPCCC image; and
c) for each pixel of the LPCCC image along the best fit line, searching vertically for a higher correlation coefficient when the pixel has a value less than a threshold to find a better frame match for temporal alignment of the frames between the test and reference video sequences.
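A highly simplified sketch of steps a) through c) follows. This is not the method of the '380 application: the row-mean profile stands in for the frame distillation measurements, and a brute-force line search over integer offsets stands in for the linear Hough transform; all function names are illustrative assumptions.

```python
import numpy as np

def distill(frame):
    # Step (a), simplified: distill a frame into a 1-D row-mean profile.
    # The actual frame distillation measurements may differ.
    return frame.mean(axis=1)

def lpccc_image(ref_frames, test_frames):
    # Pearson cross-correlation between every (test, reference) pair of
    # distilled profiles; rows index test frames, columns reference frames.
    img = np.empty((len(test_frames), len(ref_frames)))
    for i, tf in enumerate(test_frames):
        for j, rf in enumerate(ref_frames):
            img[i, j] = np.corrcoef(distill(tf), distill(rf))[0, 1]
    return img

def best_fit_alignment(img, threshold=0.8, search=2):
    n_t, n_r = img.shape

    # Step (b), simplified: instead of a linear Hough transform, brute-force
    # the integer offset of a unit-slope line maximizing summed correlation.
    def line_score(off):
        return sum(img[i, i + off] for i in range(n_t) if 0 <= i + off < n_r)

    best = max(range(-(n_t - 1), n_r), key=line_score)

    # Step (c): along the best-fit line, where a pixel falls below the
    # threshold, search vertically (over test frames) for a better match.
    matches = []
    for i in range(n_t):
        j = i + best
        if not 0 <= j < n_r:
            continue
        ti = i
        if img[i, j] < threshold:
            lo, hi = max(0, i - search), min(n_t, i + search + 1)
            ti = lo + int(np.argmax(img[lo:hi, j]))
        matches.append((j, ti))  # (reference index, matched test index)
    return best, matches
```

With a test sequence that is simply the reference sequence delayed by two frames, the recovered line offset is 2 and the matches lie along that line.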
6. The method as described in
7. The method as described in
8. The method as described in
|
Embodiments of the present invention relate, in general, to methods and systems for video processing, and more particularly, to methods and systems for video quality measurements and analysis.
Video quality measurements and analysis may require each frame in a test video sequence to be played at the same time as the corresponding frame of a reference video sequence is played. This may be referred to as temporal registration of the test video sequence and the reference video sequence.
Temporal registration may be difficult to perform, either manually or by an automated method, due to differing video formats, differing frame rates, temporal distortions, temporal impairments and other differences between the test video sequence and the reference video sequence. In addition, encoding, transporting, broadcasting, distributing, decoding and other processing of video may contribute to one or more frames being skipped in a video sequence, as illustrated in the exemplary reference and test sequences:
Reference Sequence:
A B C D E F G
Test Sequence:
A B C D E G H,
where like letters denote corresponding video frames between the reference sequence and the test sequence. In this example, frames A, B, C, D and E are aligned. However, because reference-video-sequence frame F is skipped in the test video sequence, the last two frames are mismatched between the test video sequence and the reference video sequence.
Automated methods and systems to measure the number and location of skipped frames in a test video sequence relative to a reference video sequence may be desirable. In particular, it may be desirable to have measurement methods and systems that are robust in the presence of digital compression artifacts, random noise, quantization error, and other non-linear and linear distortions and interferences. Additionally, it may be desirable to have measurement methods and systems that do not require a priori knowledge of the video content, aspect ratio, DUT (Device Under Test) pixel clock, frame rates, video parameters, including relative spatial mapping, for example, horizontal scaling, vertical scaling, offset and cropping, and other video factors. Accurate and computationally efficient methods and systems may also be desirable.
Embodiments of the present invention comprise methods and systems for detecting and locating skipped frames in a test video sequence in relation to a reference video sequence. Some embodiments of the present invention comprise identifying pairs of temporally aligned reference segments and test segments. An alignment offset and freeze-frame count associated with each segment pair may be received and used to calculate a number of skipped frames between a first segment pair and a second segment pair, wherein the second segment pair is temporally subsequent to the first segment pair. In some embodiments of the present invention, a number of skipped frames between the first segment pair and the second segment pair may be determined by calculating a segment offset between the two segment pairs and subtracting the segment offset value from the sum of the number of freeze frames associated with the first segment pair and all temporally intervening segment pairs between the first segment pair and the second segment pair.
Some embodiments of the present invention comprise calculating a similarity measure between corresponding frames in segment pairs and identifying discontinuities in the calculated similarity measures. In these embodiments, the location of skipped frames may be determined by resolving the discontinuities using the skipped-frame and freeze-frame counts associated with the segment pairs.
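The similarity-measure and discontinuity-identification steps just described might be sketched as follows. The negative mean-squared error used as the similarity measure and all names are illustrative assumptions; the invention does not fix a particular similarity measure.

```python
import numpy as np

def similarity_series(ref_segment, test_segment):
    # Similarity between corresponding temporally aligned frames; negative
    # mean-squared error is used here as one possible measure.
    return [-float(np.mean((r - t) ** 2))
            for r, t in zip(ref_segment, test_segment)]

def find_discontinuities(series, drop=0.5):
    # Flag frame indices where the similarity falls sharply relative to
    # the preceding frame pair.
    return [k for k in range(1, len(series))
            if series[k] < series[k - 1] - drop]
```

A skipped frame mid-segment shifts the remaining test frames relative to the reference frames, so the similarity series typically drops at the skip location; resolving that discontinuity with the skipped-frame and freeze-frame counts locates the skip.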
The foregoing and other objectives, features, and advantages of the invention will be more readily understood upon consideration of the following detailed description of the invention taken in conjunction with the accompanying drawings.
Embodiments of the present invention will be best understood by reference to the drawings, wherein like parts are designated by like numerals throughout. The figures listed above are expressly incorporated as part of this detailed description.
It will be readily understood that the components of the present invention, as generally described and illustrated in the figures herein, could be arranged and designed in a wide variety of different configurations. Thus, the following more detailed description of the embodiments of the methods and systems of the present invention is not intended to limit the scope of the invention but it is merely representative of embodiments of the invention.
Embodiments of the present invention may be implemented within a test and measurement instrument. For example, embodiments of the present invention may be implemented in a video test instrument, such as a picture quality analyzer. Picture quality analyzers such as the TEKTRONIX® PQA500 may incorporate embodiments of the present invention.
Elements of embodiments of the present invention may be embodied in hardware, firmware and/or software. While exemplary embodiments revealed herein may only describe one of these forms, it is to be understood that one skilled in the art would be able to effectuate these elements in any of these forms while resting within the scope of the present invention.
Embodiments of the present invention may be used to process signals originating from video equipment. These video signals might be produced by playback equipment, such as DVD players, set-top boxes or production equipment used by broadcasters, or other content providers, prior to transmission of content to end-users.
Embodiments of the present invention comprise methods and systems for detecting and locating skipped frames in a test video sequence, also referred to as a test sequence, relative to a reference video sequence, also referred to as a reference sequence.
Some embodiments of the present invention may be described in relation to
The number of skipped frames between a first segment pair s1 and a temporally subsequent second segment pair s2 may be calculated 20 according to:
SkippedFrames(s1, s2) = FreezeFrames(s1) + FreezeFrames(s1 + 1) + . . . + FreezeFrames(s2 − 1) − (alignOS(s2) − alignOS(s1)),
where FreezeFrames(s) may denote the number of freeze frames in the test-sequence segment of segment pair s and alignOS(s) may denote the alignment offset of the test-sequence segment in segment pair s. The location of the skipped frames may be determined 22 based on the number of skipped frames detected between pairs of segments. In some embodiments of the present invention, an incorrect skipped-frame count may be determined when a skipped frame occurs substantially close to the beginning of the first segment or substantially close to the end of the last segment. These outcomes may be referred to as end-point-based inconsistent results. In some embodiments of the present invention, a skipped frame that is not detected between two segment pairs may be detected between two other segment pairs.
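The skipped-frame calculation 20 described above may be sketched as follows; the function and parameter names are illustrative, not taken from the patent.

```python
def skipped_frames(freeze_frames, align_os, s1, s2):
    """Number of skipped frames between segment pair s1 and a temporally
    subsequent segment pair s2.

    freeze_frames[s] -- freeze-frame count of segment pair s
    align_os[s]      -- alignment offset of segment pair s
    """
    # Segment alignment offset between the two segment pairs.
    segment_offset = align_os[s2] - align_os[s1]
    # Freeze frames in the first pair and all temporally intervening pairs.
    total_freezes = sum(freeze_frames[s1:s2])
    # Skipped frames = summed freeze-frame count minus the segment offset.
    return total_freezes - segment_offset
```

With the Table 1 values below, `skipped_frames([0, 1, 0, 2, 0], [0, -2, -2, -2, 0], 0, 1)` returns 2, matching the two skipped reference frames G and H.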
Calculation 20 of the number of skipped frames may be understood in relation to the following example. In this illustrative example, a reference sequence comprising a sequence of reference frames and a test sequence comprising a sequence of test frames may be denoted:
A B C D E F G H I J K L M N O P Q R S T U V W X Y
Z a b c d e f g h i j k l m
and
A B C D E F I J K L M N N O P R S T U V W X Y Z a
b c d d d e f g h i j k l m,
respectively, wherein frames G, H and Q are skipped in the test sequence in relation to the reference sequence, and frames N and d in the test sequence are freeze frames with freeze-frame occurrences of one and two, respectively. Exemplary segment alignments may be given by the segment pairs:
Segment pair 0 (s = 0):
Reference-sequence segment: A B C D E F G H
Test-sequence segment: A B C D E F I J
FreezeFrames(0) = 0 and alignOS(0) = 0
Segment pair 1 (s = 1):
Reference-sequence segment: I J K L M N O P
Test-sequence segment: I J K L M N N O
FreezeFrames(1) = 1 and alignOS(1) = −2
Segment pair 2 (s = 2):
Reference-sequence segment: Q R S T U V W X
Test-sequence segment: P R S T U V W X
FreezeFrames(2) = 0 and alignOS(2) = −2
Segment pair 3 (s = 3):
Reference-sequence segment: Y Z a b c d e f
Test-sequence segment: Y Z a b c d d d
FreezeFrames(3) = 2 and alignOS(3) = −2
Segment pair 4 (s = 4):
Reference-sequence segment: g h i j k l m
Test-sequence segment: g h i j k l m
FreezeFrames(4) = 0 and alignOS(4) = 0.
The freeze-frame counts and alignment offsets may be used to calculate 20 the number of skipped frames between segment pairs. These values are summarized in Table 1 for this example.
TABLE 1
Segment Summary for Example

SEGMENT s   ALIGNMENT OFFSET alignOS(s)   NUMBER OF FREEZE FRAMES FreezeFrames(s)
0           0                             0
1           −2                            1
2           −2                            0
3           −2                            2
4           0                             0
The number of skipped frames between, at least, the middle of Segment pair 0 and, at least, the middle of Segment pair 1 may be calculated according to:
SkippedFrames(0, 1) = FreezeFrames(0) − (alignOS(1) − alignOS(0)) = 0 − (−2 − 0) = 2.
Thus, the number of skipped frames is determined to be two, which is consistent since reference frames G and H are skipped in the test segment.
The number of skipped frames between, at least, the middle of Segment pair 0 and, at least, the middle of Segment pair 2 may be calculated according to:
SkippedFrames(0, 2) = FreezeFrames(0) + FreezeFrames(1) − (alignOS(2) − alignOS(0)) = (0 + 1) − (−2 − 0) = 3.
Thus, the number of skipped frames is determined to be three, which is consistent since reference frames G, H and Q are skipped in the test segment.
The number of skipped frames between, at least, the middle of Segment pair 0 and, at least, the middle of Segment pair 3 may be calculated according to:
SkippedFrames(0, 3) = FreezeFrames(0) + FreezeFrames(1) + FreezeFrames(2) − (alignOS(3) − alignOS(0)) = (0 + 1 + 0) − (−2 − 0) = 3.
Thus, the number of skipped frames is determined to be three, which is consistent since reference frames G, H and Q are skipped in the test segments.
The number of skipped frames between, at least, the middle of Segment pair 0 and, at least, the middle of Segment pair 4 may be calculated according to:
SkippedFrames(0, 4) = FreezeFrames(0) + FreezeFrames(1) + FreezeFrames(2) + FreezeFrames(3) − (alignOS(4) − alignOS(0)) = (0 + 1 + 0 + 2) − (0 − 0) = 3.
Thus, the number of skipped frames is determined to be three, which is consistent since reference frames G, H and Q are skipped in the test segments.
The number of skipped frames between, at least, the middle of Segment pair 1 and, at least, the middle of Segment pair 2 may be calculated according to:
SkippedFrames(1, 2) = FreezeFrames(1) − (alignOS(2) − alignOS(1)) = 1 − (−2 − (−2)) = 1.
Thus, the number of skipped frames is determined to be one, which is consistent since reference frame Q is skipped in the test segment.
The number of skipped frames between, at least, the middle of Segment pair 1 and, at least, the middle of Segment pair 3 may be calculated according to:
SkippedFrames(1, 3) = FreezeFrames(1) + FreezeFrames(2) − (alignOS(3) − alignOS(1)) = (1 + 0) − (−2 − (−2)) = 1.
Thus, the number of skipped frames is determined to be one, which is consistent since reference frame Q is skipped in the test segments.
The number of skipped frames between, at least, the middle of Segment pair 1 and, at least, the middle of Segment pair 4 may be calculated according to:
SkippedFrames(1, 4) = FreezeFrames(1) + FreezeFrames(2) + FreezeFrames(3) − (alignOS(4) − alignOS(1)) = (1 + 0 + 2) − (0 − (−2)) = 1.
Thus, the number of skipped frames is determined to be one, which is consistent since reference frame Q is skipped in the test segments.
The number of skipped frames between, at least, the middle of Segment pair 2 and, at least, the middle of Segment pair 3 may be calculated according to:
SkippedFrames(2, 3) = FreezeFrames(2) − (alignOS(3) − alignOS(2)) = 0 − (−2 − (−2)) = 0.
Thus, the number of skipped frames is determined to be zero, which is inconsistent due to end-point conditions.
The number of skipped frames between, at least, the middle of Segment pair 2 and, at least, the middle of Segment pair 4 may be calculated according to:
SkippedFrames(2, 4) = FreezeFrames(2) + FreezeFrames(3) − (alignOS(4) − alignOS(2)) = (0 + 2) − (0 − (−2)) = 0.
Thus, the number of skipped frames is determined to be zero, which is inconsistent due to end-point conditions.
The number of skipped frames between, at least, the middle of Segment pair 3 and, at least, the middle of Segment pair 4 may be calculated according to:
SkippedFrames(3, 4) = FreezeFrames(3) − (alignOS(4) − alignOS(3)) = 2 − (0 − (−2)) = 0.
Thus, the number of skipped frames is determined to be zero, which is consistent since no frames are skipped in the test segment.
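The ten pairwise calculations above may be reproduced with a short script; the variable names and dictionary layout are illustrative.

```python
# Table 1 values from the example above.
freeze_frames = [0, 1, 0, 2, 0]
align_os = [0, -2, -2, -2, 0]

def skipped_frames(s1, s2):
    # Summed freeze-frame counts minus the segment alignment offset.
    return sum(freeze_frames[s1:s2]) - (align_os[s2] - align_os[s1])

# Documented results for every pair of segment pairs (s1, s2), s1 < s2.
expected = {
    (0, 1): 2, (0, 2): 3, (0, 3): 3, (0, 4): 3,
    (1, 2): 1, (1, 3): 1, (1, 4): 1,
    (2, 3): 0, (2, 4): 0,  # end-point-based inconsistent results
    (3, 4): 0,
}
results = {pair: skipped_frames(*pair) for pair in expected}
```

Running the script confirms that every computed count matches the corresponding result worked out above, including the two end-point-based inconsistent zeros.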
In some embodiments of the present invention, temporal alignment 14, offset determination 16 and freeze-frame determination 18 for each segment pair may be performed according to methods and systems developed by Kevin M. Ferguson, the present inventor, and described in U.S. patent application Ser. No. 12/104,380, hereinafter the '380 application, entitled “Systems and Methods for Robust Video Temporal Registration,” filed on Apr. 16, 2008, and which is hereby incorporated herein by reference in its entirety. The '380 application claims the benefit of U.S. Provisional Patent Application Ser. No. 60/912,167 entitled “Systems and Methods for Robust Video Temporal Registration,” filed on Apr. 16, 2007, and which is hereby incorporated herein by reference in its entirety. In these embodiments, the methods and systems of the '380 application may be applied successively using each segment result to initialize the next search.
In alternative embodiments of the present invention, temporal alignment 14, offset determination 16 and freeze-frame determination 18 may be performed according to alternative methods. Exemplary alternative methods may include manual methods, automated methods and other methods known in the art.
In some embodiments of the present invention described in relation to
In some embodiments of the present invention described in relation to
In some embodiments of the present invention, discontinuity resolution 34, 46 may comprise alignment compensation for freeze frames.
In some embodiments of the present invention, multiple skipped frames per segment pair may be located by adjusting the alignment of the segments within the segment pair to favor the portion with the poorest alignment at a previously determined skipped-frame discontinuity, and then successively locating additional skipped frames according to the above-described embodiments.
Discontinuity resolution according to some embodiments of the present invention may be understood in relation to the following examples depicted in
The above-described examples are intended to illustrate discontinuity resolution, and are not intended to be considered an exhaustive description of rules for discontinuity resolution.
The terms and expressions which have been employed in the foregoing specification are used therein as terms of description and not of limitation, and there is no intention in the use of such terms and expressions of excluding equivalence of the features shown and described or portions thereof, it being recognized that the scope of the invention is defined and limited only by the claims which follow.