Portions that represent motion are identified in an original frame of interlaced video fields. A de-interlaced version of the frame is generated. The original frame and the de-interlaced version are combined to form a resulting frame, the resulting frame including portions from the original frame that represent relatively less motion and portions from the de-interlaced version that represent relatively more motion.
1. A method comprising identifying portions of an original frame of interlaced video fields, the portions comprising pixels representing motion in the original frame, the portions representing motion being identified by comparing corresponding portions of at least two frames,
generating a de-interlaced version of the original frame, and combining the original frame and the de-interlaced version to form a resulting frame, the resulting frame including portions from the original frame that represent relatively less motion, and portions from the de-interlaced version that represent relatively more motion.
2. The method of
3. The method of
4. The method of
5. The method of
6. A method comprising
identifying, in an original frame of interlaced video fields, pixels that represent motion in the frame, by comparing corresponding pixels in different color channels of at least two frames, generating a de-interlaced version of the frame, combining the original frame and the de-interlaced frame to form a resulting frame, the resulting frame including pixels from the original frame that represent relatively less motion, pixels from the de-interlaced frame that represent relatively more motion, and pixels that are weighted averages of pixels in two fields of the frame.
7. Machine-readable code stored on a medium, the code being capable of configuring a machine to
identify portions of an original frame of interlaced video fields, the portions comprising pixels and representing motion in the original frame, the portions representing motion being identified by comparing corresponding portions of at least two frames, generate a de-interlaced version of the original frame, and combine the original frame and the de-interlaced version to form a resulting frame, the resulting frame including portions from the original frame that represent relatively less motion, and portions from the de-interlaced version that represent relatively more motion.
This invention relates to video field artifact removal.
The NTSC video standard represents moving pictures using 30 individual frames per second. Each frame consists of two separate fields, which are sampled at different points in time. One field is displayed as the even scan-lines of a frame, and the other field is displayed as the odd scan-lines of the frame. If video is displayed at the intended rate, this allows a balance in the tradeoff between smooth motion and image resolution quality. However, if a single frame of video is viewed as a still, or if the format or frame-rate is changed, such as when video material is copied to film, these field artifacts can become visible and are undesirable.
Field artifacts can typically be removed by blurring the image vertically or averaging the scan-lines of one field to replace the other. These techniques can successfully eliminate field artifacts, but the effective resolution of the entire image is also reduced.
The invention provides a process for adaptively removing field artifacts from digital video material. The process preserves the resolution quality of stationary areas of frames, while removing field artifacts where they exist.
Thus, in general, the invention features:
identifying, in an original frame of interlaced video fields, portions that represent motion in the frame,
generating a de-interlaced version of the frame, and
combining the original frame and the de-interlaced frame to form a resulting frame, the resulting frame including portions from the original frame that represent relatively less motion, and portions from the de-interlaced version that represent relatively more motion.
Implementations of the invention may include one or more of the following features. The portions of the fields and frames may be pixels. The portions that represent motion in the frame may be identified by comparing corresponding portions of at least two frames. The corresponding portions may be compared separately in different color channels. The method may also include adjusting the results of the identifying step based on at least one of the following parameters: scale, threshold, or blur. Some portions of the de-interlaced version of the frame may be generated by a weighted averaging of portions of two fields of the frame. The de-interlaced frame may be generated in conjunction with a selected frame-rate change.
Other advantages and features will become apparent from the following description and from the claims.
The file of this patent contains at least one drawing executed in color.
To adaptively remove field artifacts from a sequence of video frames, e.g., digital video frames, each frame in the sequence is processed as follows (see the flow chart in FIG. 4):
Step 1
Locate the areas of the frame that contain motion to generate a MotionMatte image 20. This is done by comparing the current frame with the previous and next frames in the clip.
This is performed for the red, green, and blue channels independently, and then the maximum value of each pixel among the three channels is used to create a monochrome MotionMatte image.
The MotionMatte is then modified 22 using several parameters that can be adjusted if necessary by a user:
Scale MotionMatte: The MotionMatte is brightened overall by a given amount. A factor of 20 is typically used, though other factors in the range of 1 to 100 may also be used. The scaling adjusts how many field artifacts are removed: the factor can be increased to remove more field artifacts, or decreased to remove fewer and keep the image sharper.
Threshold MotionMatte: This value is subtracted from the MotionMatte and can be increased to reduce unwanted de-interlacing due solely to noise. A value of 0.05 (or 5%) is typically used, though other values in the range 0 to 1 may also be used.
Blur MotionMatte: Determines how much the MotionMatte is smoothed to avoid sharp transitions between the interlaced and de-interlaced areas. This value is typically set to a small amount of blur.
Finally, the pixel values of the MotionMatte are clamped so that none are below 0 (black) or above 1.0 (white).
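Step 1 can be expressed as the following Python sketch. It is illustrative only: frames are assumed to be lists of scan-line rows of (r, g, b) tuples with values in [0, 1], the way the previous- and next-frame differences are combined (here, a maximum) is an assumption the description leaves open, and the blur step is omitted for brevity.

```python
def motion_matte(prev, curr, nxt, scale=20.0, threshold=0.05):
    """Build a monochrome MotionMatte: bright where motion is detected.

    Sketch only. Frames are lists of rows of (r, g, b) tuples in [0, 1].
    Combining the previous- and next-frame differences with max() is an
    assumption; the blur step of the described process is omitted.
    """
    height, width = len(curr), len(curr[0])
    matte = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Per-channel absolute differences against both neighboring
            # frames, then the maximum across the three color channels.
            m = 0.0
            for c in range(3):
                d = max(abs(curr[y][x][c] - prev[y][x][c]),
                        abs(curr[y][x][c] - nxt[y][x][c]))
                m = max(m, d)
            # Scale, subtract the noise threshold, then clamp to [0, 1].
            matte[y][x] = min(1.0, max(0.0, m * scale - threshold))
    return matte
```

With the default scale of 20, even a moderate per-channel difference saturates the matte to white, which matches the intent of removing artifacts aggressively in moving areas.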
Step 2
Create a de-interlaced version of the entire original frame. This can be done using one of three techniques:
A. The first field (even scan-lines) is kept and the second field (odd scan-lines) is removed. Each pixel in the field to be removed is replaced by the average of the pixel above and the pixel below (which belong to the remaining field).
B. The second field is kept and the first field is removed, using the analogous method as in A.
C. The two fields are de-interlaced and merged together. This is done by averaging the results of both A and B above.
Note that with any of these techniques alone, the resolution quality of the frame has been reduced due to the vertical averaging.
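The three techniques can be sketched as follows, assuming a grayscale frame stored as a list of scan-line rows (field 1 on the even rows, field 2 on the odd rows, at least two rows). Copying the single neighboring scan-line at the top and bottom boundaries is an assumption, since the description does not address frame edges.

```python
def deinterlace(frame, keep="first"):
    """De-interlace by vertical averaging (sketch; grayscale rows).

    keep="first"  -> technique A: keep even scan-lines, rebuild odd ones.
    keep="second" -> technique B: keep odd scan-lines, rebuild even ones.
    keep="merge"  -> technique C: average the results of A and B.
    """
    if keep == "merge":
        a = deinterlace(frame, "first")
        b = deinterlace(frame, "second")
        return [[(pa + pb) / 2.0 for pa, pb in zip(ra, rb)]
                for ra, rb in zip(a, b)]
    kept_parity = 0 if keep == "first" else 1
    height = len(frame)
    out = [row[:] for row in frame]
    for y in range(height):
        if y % 2 != kept_parity:
            # The rows above and below belong to the remaining field.
            above = frame[y - 1] if y - 1 >= 0 else None
            below = frame[y + 1] if y + 1 < height else None
            if above is None:       # top boundary: copy the row below
                out[y] = below[:]
            elif below is None:     # bottom boundary: copy the row above
                out[y] = above[:]
            else:
                out[y] = [(pa + pb) / 2.0 for pa, pb in zip(above, below)]
    return out
```

Technique C is simply the pixel-wise average of A and B, which is why it, too, reduces vertical resolution.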
Step 3
The original frame and the de-interlaced frame from Step 2 are combined 24 using the MotionMatte from Step 1, to give the result for that frame with the field artifacts removed. Pixels of the de-interlaced frame are used in the final frame in locations where the pixels of the MotionMatte are white (1.0). Pixels of the original frame are used where the MotionMatte is black (0). A weighted average of the two pixel values is used where the MotionMatte is gray. Thus, for each [x,y] pixel coordinate:
Result[x,y] = MotionMatte[x,y]*Deinterlaced[x,y] + (1 - MotionMatte[x,y])*Original[x,y]
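The per-pixel combination just described is a linear blend with the MotionMatte value as the weight on the de-interlaced pixel. A minimal sketch, again using grayscale rows with values in [0, 1]:

```python
def combine(original, deinterlaced, matte):
    """Blend original and de-interlaced frames, weighted by the MotionMatte.

    Where the matte is 1.0 the de-interlaced pixel is used, where it is
    0.0 the original pixel is used, and gray matte values give a weighted
    average of the two.
    """
    return [[m * d + (1.0 - m) * o
             for o, d, m in zip(orow, drow, mrow)]
            for orow, drow, mrow in zip(original, deinterlaced, matte)]
```

Because the matte was blurred in Step 1, this blend transitions smoothly between fully interlaced and fully de-interlaced regions.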
Speed Change Options
This procedure may be combined with an optional frame-rate conversion process that automatically calculates which fields are kept for each frame while de-interlacing in Step 2 above.
If the user elects to generate a "field-removed" result at half speed, each original frame generates two resulting frames, the first keeping field 1 and the second keeping field 2 in the de-interlace step.
If the user wants to convert from 30 video frames per second to 24 "field-removed" frames per second, the procedure calculates the appropriate pattern of which fields to extract from each original frame to give the smoothest resulting motion, such as: 1st, 1st, 2nd, 2nd, none, etc.
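The two speed-change options above can be sketched as simple field-selection schedules. The repeating five-frame cadence mirrors the example pattern given in the description; it is an illustrative reading, not the exact calculation the process performs.

```python
def fields_to_keep_30_to_24(n_frames):
    """30 fps -> 24 fps: a repeating 5-frame cadence keeping four of
    every five frames, per the example pattern 1st, 1st, 2nd, 2nd, none.
    None means the frame produces no output."""
    cadence = ["first", "first", "second", "second", None]
    return [cadence[i % len(cadence)] for i in range(n_frames)]

def fields_to_keep_half_speed(n_frames):
    """Half speed: each original frame yields two results, the first
    keeping field 1 and the second keeping field 2."""
    out = []
    for _ in range(n_frames):
        out.extend(["first", "second"])
    return out
```

Each entry in the returned schedule names the field to keep when de-interlacing that frame in Step 2, so 30 input frames yield 24 output frames under the first schedule and 60 under the second.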
User Interface
In addition to user interface elements that enable a user to set the scale, threshold, and blur values, the user interface can include two pop-up menus. One pop-up menu enables the user to choose between two output options: "result", which outputs the de-interlaced result normally, and "MotionMatte", which displays the MotionMatte and can be helpful when adjusting the other parameters. The other pop-up menu enables the user to select which field to preserve in areas with field artifacts; the user may select either field or a merger of the two. An additional control enables the user to select changes to the frame rate, such as "same speed", "half speed de-interlaced", or "NTSC to film" (30 fps to 24 fps).
Other embodiments are within the scope of the following claims.