Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Doom9's Forum > Capturing and Editing Video > Avisynth Usage
Old 22nd July 2012, 17:13   #1  |  Link
Registered User
Join Date: Jan 2006
Posts: 1,867
Fuzzy Match Noisy Clips - for Median Multicap

There is a popular technique for capturing noisy VHS: capture the same tape multiple times, then apply a median across the captures. For this to work, the captures must be in perfect sync. It's difficult to avoid dropped frames, however, so we need a way to sync the captures automatically. And the material is, by definition, noisy, which makes frame matching difficult.
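As background, the median step itself is simple: for each pixel, take the median of that pixel's value across the captures, so noise that hits only one capture gets voted out. A minimal sketch in plain Python (lists of luma values stand in for frames; the function name is mine, not from any plugin):

```python
from statistics import median

def median_of_captures(frames):
    """Given the same frame from each capture (equal-length lists of
    pixel values), return the per-pixel median across captures.
    Impulse noise that hits only one capture is voted out."""
    return [median(pixels) for pixels in zip(*frames)]

# Three captures of one frame; capture 2 has a noise spike at pixel 1.
cap1 = [10, 20, 30]
cap2 = [10, 255, 30]
cap3 = [11, 21, 29]
print(median_of_captures([cap1, cap2, cap3]))  # [10, 21, 30]
```

If one capture has dropped a frame, the values at each pixel no longer come from the same source frame, which is exactly why sync matters.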

The Problem
The objective is to determine where two or more clips go out of sync due to dropped frames in either one, then to replace each lost frame with a copy from another capture to bring the clips back into sync. No interpolation is necessary. With three captures, the median will fail for that frame anyway, because there are only two real frames to process. We only need to search +- 1 frame.

Related Problems
Removing dupes in 18fps film capture, finding censored scenes in one version of a movie, interpolating missing frames, shutterless film projector capture, lining up the start of two movies, fixing bad frames, finding drops with only one reference clip.

Previous Work
Matching functions: I've dismissed YDifferenceFromPrevious because the clips are too noisy. I tried various filters like dedupe with little success, but I got some promising results with an early correlation script.
Other attempts: other people have scripts to find dropped frames, but my torture-test case is extremely noisy. Findframes, ClipClop, and my corr plugin have all been recent advancements preparing for this work.
Drop patterns: studied with my Glitch Analyzer script, by testing settings in Virtualdub, and from some examples on the forums.

The Approach
Instead of relying on an exact match/mismatch answer for every frame, I am looking for reliable, definite answers about some frames, leaving the status of the others unknown. This lets me detect an out-of-sync condition eventually, though I can't say exactly when it started. The location would be saved to a log to fix by hand; otherwise you would put up with ghosting in the resulting median which lasts for a few frames.

Training the Match Function
I've written a short test to fuzzy match really noisy clips.
-Pick a matching function; this function should give a similarity measure.
-Handpick two clips that are in sync.
-Find a matching coefficient range for this condition.
-Put the clips one frame out of sync (representing a dropped frame), and find a new range.
The out of sync clips should have a lower match coefficient than the sync clips (assuming there's motion).
My comparison function was a correlation (with my corr plugin). A correlation approaches 1 as the images become similar.
For a high motion clip (ideal case), these are my results:
clips really match
.735454 - .808188

clips really have no match
.720647 - .791179
So the only way to say they definitely match is if r > .791179.
The only way to say they definitely don't match is if r < .735454.
Out of 31 frames, I found 7 that definitely match in the in-sync clips, and 9 that definitely don't match in the out-of-sync clips. The rest are considered unknown.
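Because the two trained ranges overlap, each frame gets a three-way verdict rather than a yes/no. A sketch of that decision rule in Python, using the thresholds above (the function name is mine; in the real script, jcorr supplies r):

```python
def classify(r, match_min=0.735454, nomatch_max=0.791179):
    """Three-way verdict from overlapping trained ranges:
    r above the highest no-match value seen -> definite match,
    r below the lowest match value seen -> definite mismatch,
    anything inside the overlap stays unknown."""
    if r > nomatch_max:
        return "match"
    if r < match_min:
        return "no match"
    return "unknown"

print(classify(0.80))  # match
print(classify(0.72))  # no match
print(classify(0.75))  # unknown
```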

It turns out I can tell the clips are out of sync within 6 frames, for the given test clip. This has to do with how many unknowns in a row there are.

Patterns of Dropped Frames
1st type:
Dupe, normal frame, dropped frame, with the glitch appearing every 13-15 frames. Example: 1, 2, 2, 3, 5, 6... If this pattern holds, I can find the duped frame easily enough, because it's a digital exact copy in the case of a capture card, or a very close lumadiff in the case of a capture from a TBC which is hiccupping. I've also seen a dupe, drop pattern. Example: 1, 2, 2, 4, 5...
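For the capture-card case, spotting the bit-exact dupes needs no fuzzy matching at all. A minimal sketch (frame numbers stand in for frame contents; the function name is mine):

```python
def find_exact_dupes(frames):
    """Return the indices i where frame i is an exact copy of
    frame i-1 (the dupe half of a dupe/drop glitch)."""
    return [i for i in range(1, len(frames)) if frames[i] == frames[i - 1]]

# The dupe/normal/drop pattern from the example: 1, 2, 2, 3, 5, 6...
print(find_exact_dupes([1, 2, 2, 3, 5, 6]))  # [2]
```

For the hiccupping-TBC case, the `==` test would be replaced with a luma-difference threshold rather than exact equality.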

2nd type:
Random frames just go missing. Example: 1, 3, 4... In that case the clips go out of sync permanently. I should be able to detect this within 6 frames, as above, for this particular test clip.

I guess that covers it. It should be possible to sync any clips.

Alternative Measurement Approach
I could do this another way: for each frame, test whether it matches the same frame in the 2nd clip or the previous frame in the 2nd clip better. If it matches the previous one better, then the current frame is missing in the 2nd clip. I think this will give a definite answer more frequently.
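That test can be sketched directly (a toy in plain Python; `sim` is a stand-in for a real similarity function such as the corr plugin's, and single numbers stand in for whole frames):

```python
def drop_before(sim, a, b, i):
    """True when clip b looks like it dropped a frame at or before
    position i: frame i of clip a then matches b's PREVIOUS frame
    better than b's current one (higher sim = more similar)."""
    return sim(a[i], b[i - 1]) > sim(a[i], b[i])

sim = lambda x, y: -abs(x - y)   # toy similarity on one-pixel "frames"
a = [0, 50, 60, 90]              # reference capture
b = [0, 60, 90, 95]              # same capture with the "50" frame dropped

print(drop_before(sim, a, b, 1))  # False - not detectable yet
print(drop_before(sim, a, b, 2))  # True  - b is running one frame ahead
```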

I should also report the low-motion training case. It's much worse: I end up with long stretches where I can only find matches but not mismatches. The training will be critical, and there's no guarantee the result is always definite. I should set the thresholds between the lowest no-match and the lowest match values.

Questions to Explore
-How to optimize pre-filtering to increase matching reliability
-Testing performance on truly unknown videos: was the training good enough to work in the longer term?
-Relatively finding dropped frames amongst >2 clips
-Automatically inserting missing frames from one of the other clips (just enough to make a median work)
-Optimizing the comparison function
-Measuring statistics of the matching, longest string of unknowns, % known, low/high motion areas, how often/many frames get dropped
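On the point about automatically inserting missing frames: once a drop is localized, restoring sync is just splicing in a stand-in frame so the captures line up again. A rough sketch under my own naming (here the stand-in is copied from the reference clip; in practice you would take it from one of the other captures, and `is_same` would be a fuzzy corr-based test rather than equality):

```python
def resync_by_insertion(ref, broken, is_same):
    """Walk a reference capture and a capture with drops; wherever
    `broken` is missing a frame, insert a substitute so both lists
    end up the same length and frame-for-frame in sync."""
    out, j = [], 0
    for i in range(len(ref)):
        if j < len(broken) and is_same(ref[i], broken[j]):
            out.append(broken[j])   # frames agree: keep broken's frame
            j += 1
        else:
            out.append(ref[i])      # drop detected: splice in a stand-in
    return out

# Frame 3 was dropped in the second capture; it gets filled back in.
print(resync_by_insertion([1, 2, 3, 4, 5], [1, 2, 4, 5], lambda x, y: x == y))
# [1, 2, 3, 4, 5]
```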

The Latest Test Script(s)
A package for you to experiment with is here: http://www.sendspace.com/file/mr0uss
#line-up by jmac
#an experimental script to eventually sync noisy videos of the same source with relatively dropped frames
#this test used improperly capped passes, with random inserted and skipped frames

#corr by jmac
#masktools v2

#The script is for experimentation with frame syncing.  Please disable ShowMedianCleaned first.

global rmin=1.
global rmax=-1.

#not used yet
global rminmatch=1.
global rmaxmatch=-1.
global rminnomatch=1.
global rmaxnomatch=-1.

#Adjust this to your work directory
dir="C:\project001\multicap resync\"

#Load a set of numbered clips; set fn to the base filename of your captures
fn="capture"#(assumed name; substitute the files from the package)
v1=avisource(dir+fn+String(1, "%02.0f")+".avi")
v2=avisource(dir+fn+String(2, "%02.0f")+".avi")
v3=avisource(dir+fn+String(3, "%02.0f")+".avi")
v4=avisource(dir+fn+String(4, "%02.0f")+".avi")
v5=avisource(dir+fn+String(5, "%02.0f")+".avi")

#Set the colorspace, or pre-filter the videos

#Label the videos
v1=v1.subtitle("Capture 1").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v2=v2.subtitle("Capture 2").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v3=v3.subtitle("Capture 3").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v4=v4.subtitle("Capture 4").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v5=v5.subtitle("Capture 5").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")

#Make copies in a global context, just to make things easier later
Global v1=v1
Global v2=v2
Global v3=v3
Global v4=v4
Global v5=v5
#Show how well the multicap technique works when the captures are lined up!
ShowMedianCleaned()#You should disable this line in order to do your own frame matching experiments.  This video is NOT the point of the script!
#return last
#Sync the videos.  For now, just print some statistics.
#To use this part: seek to frame 0, hit refresh, scroll through 1 frame at a time until frame 10, then write down rmin/rmax.  These are the values for the match condition.
#Seek to frame 11, hit refresh (resets rmin/rmax), scroll to the end, then write down rmin/rmax.  These are the values for the mismatch condition.
#Now place the correct thresholds in FindThresh, then scroll through the whole video and count the frames detected as match, no match, and unknown.
#For version 0.2, automate that process!
findthresh(v1,v2)#Find range for matching clips
last+findthresh(v1,v2.trim(1,0)).trim(0,9)#Find range for non-matching clips (v2 shifted by one frame)
return last#+subtitle("match="+string(rminmatch)+"-"+string(rmaxmatch)+", nomatch="+string(rminnomatch)+"-"+string(rmaxnomatch),y=64)

#Alternative outputs kept for experimentation (unreachable after the return above):
#return help(v1,v2)
#return subtract(v1,v2).ScriptClip("""subtitle("r = " + string(r),y=-1)""")

#sync helper
function help(clip a, clip b) {
    #A helper function to visually compare two clips.  It shows the top half of each clip, followed by their blend.
    #This arrangement was designed to show whether the clips are in sync.
    #(body reconstructed from the description above)
    ta=a.crop(0,0,0,-a.height/2)#Top half of clip a
    tb=b.crop(0,0,0,-b.height/2)#Top half of clip b
    stackvertical(ta,tb,merge(ta,tb))
}
function common(clip a) {
    #Shared pre-processing applied before comparison (the depulse filtering from PreFilter)
    mt_luts( a, a, mode = "med", pixels = mt_rectangle( 1,1 ), expr = "y" )
}
function FindThresh(clip a, clip b) {
    #Using a known synced clip, find fuzzy comparison values.
    #You are meant to play through this video once (for each condition), after which the trained thresholds are set.
    #You could make it a first pass, or prepend it to your video during testing.
    stackhorizontal(a,b)#to pass two clips; could also use GRunT
    a=crop(0,0,-last.width/2,0)#Extract the left clip
    b=crop(last.width/2,0,0,0)#Extract the right clip
    r=jcorr(a, b)#Measure similarity between the two clips
    global rmin=min(rmin,r)#(reconstructed) track the range of r seen so far
    global rmax=max(rmax,r)#(reconstructed)
    subtitle("Finding Thresholds",y=16)
    subtitle("r="+string(r)+", rmin="+string(rmin)+", rmax="+string(rmax),y=32)
    #slight problem below - r>.906036 is evaluating as true when r=.906036
    status=r>.906036?"match":r<.888354?"no match":"unknown"
    subtitle("status: "+status,y=48)#(reconstructed)
}

function TemporalMedian(clip a, int n) {
    #Perform a temporal median of n clips.
    #The clips must be prepared with this command: interleave(clip1, clip2, ... clipn)
    a=a.MedianBlurT(0,0,0,(n-1)/2)#From MedianBlur dll; temporal radius widened from the hard-coded 1 so all n captures are covered
    a.selectevery(n,(n-1)/2)#(reconstructed) keep one cleaned frame per group of n
}

function ShowMedianCleaned() {
    #This is just to show the ideal result we are aiming for
    Interleave(v1, v2, v3, v4, v5)
    TemporalMedian(5).subtitle("Median Cleaned version with 5 captures",y=120)
}

function PreFilter(clip a) {
    #Experiment with different pre-filtering options here
    mt_luts( a, a, mode = "med", pixels = mt_rectangle( 1,1 ), expr = "y" )#Perform a depulse pre-filtering
}
Useful Links
-Very good explanation of the use of median for noise reduction
-Description, and another rediscovery, of the multiple capture technique
-Dropped frames in Virtualdub
-Inserted frames in Virtualdub
-Recognition of the Sync problem in relation to Median Technique
-Find frames missing in one video but not the other
-Find matching frame/clip in another clip
-Test a Capture System for dupes/drops with Glitch Analyzer
-FillDropsI - Interpolates Dropped Frames

Last edited by jmac698; 23rd July 2012 at 17:30. Reason: Improved writing
jmac698
Old 23rd July 2012, 01:35   #2  |  Link
It's been suggested to blur the clips before detection. Some things I should point out about correlation: it's unaffected by brightness, contrast, or rearranging pixels (the same way in both clips), so it's already a pretty good comparison function.
Pulses are the addition of white lines, i.e. added brightness. If there's the same amount of pulsing in both clips, it won't affect the correlation. If the pulses cover up some unique pixels, it reduces the correlation.
I don't have much hope for blurring either, because that just spreads the energy around; it doesn't change it.
But I can try it anyway.
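The brightness/contrast invariance is a property of normalized correlation: both images are mean-subtracted and scale-normalized before comparison. A quick demonstration in plain Python using Pearson's r, the standard form of such a coefficient (the corr plugin's exact formula is not spelled out here, so treat this as an illustration of the principle):

```python
from math import sqrt

def pearson(x, y):
    """Normalized cross-correlation of two equal-length pixel lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

frame = [10, 40, 20, 80, 60]
brighter  = [p + 25 for p in frame]      # brightness shift: r stays ~1.0
stretched = [2 * p + 5 for p in frame]   # contrast change:  r stays ~1.0
shuffled  = list(reversed(frame))        # same pixels rearranged in ONE clip
print(pearson(frame, brighter))
print(pearson(frame, stretched))
print(pearson(frame, shuffled))          # no longer ~1.0
```

This matches the tweak(bright=-10) observation later in the thread: a pure brightness offset leaves r unchanged, while rearranging the pixels of only one clip does change it.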
jmac698
Old 23rd July 2012, 06:31   #3  |  Link
*.mp4 guy
Registered User
Join Date: Feb 2004
Posts: 1,350
You will probably have to use a 'real' denoiser. The fastest that might work is some sort of median filter. Obviously temporal processing should be avoided. Other than that, you really just need to try tons of stuff until you find something that works (there is always something), if you really need to improve it, that is.
*.mp4 guy
Old 23rd July 2012, 06:46   #4  |  Link
Step 1: How to optimize pre-filtering to increase matching reliability

Using mp4's suggestion, I added depulse pre-filtering
function common(clip a) {
    mt_luts( a, a, mode = "med", pixels = mt_rectangle( 1,1 ), expr = "y" )
}
And the results:
match
.895015 - .936733

no match
.878057 - .933062

5 true matches, 20 true mismatches
Out of 31 frames. There are some very long gaps between known anchors, but if there's a dropped frame I should know pretty quickly; however, where it actually started will only be known to within a large range.

I've also verified my thoughts on the correlation; by using tweak(bright=-10,coring=false) on one clip, the correlation didn't change.

Results summary
Filtering type  Matches  Mismatches MatchRun MismatchRun
none            7         9
rect(1,1)       5        20         17       6
rect(1,3)       3        20
Interpretation: the higher the numbers in the first two columns, the better. Matches means the number of detected matches that were truly matches; likewise for Mismatches. MatchRun means the longest run of unknown status during a matched portion; likewise for a mismatched portion. What this tells us is the size of the window within which we don't know where the dropped frame occurred, and we want to reduce this number. The worst case is the sum of the two: up to 23 frames before we notice a dropped frame.

Last edited by jmac698; 23rd July 2012 at 14:42.
jmac698
Old 23rd July 2012, 17:19   #5  |  Link
I've posted a package to experiment with this: http://www.sendspace.com/file/mr0uss (but see the first message for updated links). It's using different clips than the ones in the results reported above. This is actually an hour away from doing something useful.

I know there are several people who will be interested in this development. If you can make any discoveries, let me know.

Last edited by jmac698; 23rd July 2012 at 17:36.
jmac698
