22nd July 2012, 16:13 | #1 | Link
Registered User
Join Date: Jan 2006
Posts: 1,867
Fuzzy Match Noisy Clips - for Median Multicap
Background
There is a popular technique for capturing noisy VHS where you capture the same tape multiple times, then apply a median across the captures. For this to work, the captures must be in perfect sync. Dropped frames are difficult to avoid, however, so we need a way to sync the captures automatically. And the material is by definition noisy, which makes frame matching difficult.

The Problem
The objective is to determine where two or more clips become out of sync due to dropped frames in either, then to replace the lost frames with a copy from another capture to bring the clips back into sync. No interpolation is necessary. With three captures, the median will fail for such a frame, because only two real frames remain to process. We only need to search +-1 frame.

Related Problems
Removing dupes in 18fps film capture, finding censored scenes in one version of a movie, interpolating missing frames, shutterless film projector capture, lining up the start of two movies, fixing bad frames, finding drops with only one reference clip.

Previous Work
Matching functions: I've dismissed YDifferenceFromPrevious because the clips are too noisy. I tried various filters like dedupe with little success. I got some promising results with an early correlation script.
Other attempts: other people have scripts to find dropped frames, but my torture test case is extremely noisy. Findframes, ClipClop, and my corr plugin have all been recent advancements preparing for this work.
Drop patterns: studied with my Glitch Analyzer script, by testing settings in VirtualDub, and from some examples on the forums.

The Approach
Instead of relying on an exact match/mismatch answer for every frame, I am looking for reliable, definite answers about some frames, leaving the status of the others unknown. This lets me detect an out-of-sync condition eventually, though I can't say exactly when it started. The detections would be saved to a log to fix by hand, or else you would put up with a few frames of ghosting in the resulting median.
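To make the multicap idea concrete, here is a minimal pure-Python sketch of a pixelwise temporal median. It is only an illustration of the principle: frames are reduced to flat lists of pixel values, whereas the actual scripts in this thread do it with MedianBlurT on interleaved AviSynth clips.

```python
from statistics import median

def temporal_median(captures):
    """Pixelwise median across several captures of the same frame.

    captures: list of frames, each a flat list of pixel values.
    With 3+ in-sync captures, impulsive noise (dropouts, sparkles)
    that appears in only one capture is voted out.
    """
    return [median(pixels) for pixels in zip(*captures)]

# Three captures of the same frame; each has noise in a different place.
cap1 = [10, 10, 255, 10]   # sparkle at pixel 2
cap2 = [10,  0,  10, 10]   # dropout at pixel 1
cap3 = [10, 10,  10, 90]   # glitch at pixel 3

print(temporal_median([cap1, cap2, cap3]))  # -> [10, 10, 10, 10]
```

This is also why sync matters: if one capture is a frame off, its pixels vote against the other two and the median ghosts instead of cleaning.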
Training the Match Function
I've written a short test to fuzzy match really noisy clips:
-Pick a matching function; it should give a similarity measure.
-Handpick two clips that are in sync.
-Find the range of matching coefficients for this condition.
-Put the clips one frame out of sync (representing a dropped frame), and find a new range.
The out-of-sync clips should have a lower match coefficient than the in-sync clips (assuming there's motion). My comparison function was a correlation (with my corr plugin); a correlation approaches 1 as the images become similar. For a high-motion clip (the ideal case), these are my results: Quote:
The only way to say the frames definitely don't match is if r<.735454. Out of 31 frames, I found 7 that match in the in-sync clips, and 9 that don't match in the out-of-sync clips. The rest are considered unknown. It turns out I can tell if the clips are out of sync within 6 frames, in the given test clip. This has to do with how many unknowns in a row there are.

Patterns of Dropped Frames
1st type: dupe, normal frame, dropped frame, with the glitch appearing every 13-15 frames. Example: 1, 2, 2, 3, 5, 6... If this pattern holds, I can find the duped frame easily enough, because it's an exact digital copy (in the case of a capture card) or a very close lumadiff (in the case of a capture of a TBC which is hiccupping). I've also seen a dupe, drop pattern. Example: 1, 2, 2, 4, 5...
2nd type: random frames just go missing. Example: 1, 3, 4... In that case the clips go out of sync permanently. I should be able to detect this within, as I say, 6 frames for this particular test clip. I guess that covers it. It should be possible to sync any clips.

Alternative Measurement Approach
I could do this another way: test whether each frame matches the same frame in the 2nd clip, or the previous frame in the 2nd clip, better. If it matches the previous, then the current frame is missing in the 2nd clip. I think this will give a definite answer more frequently.

Results
I should also report the low-motion training case. It's much worse: I end up with long stretches where I can only find matches but not mismatches. The training will be critical, and there's no guarantee the result is always definite. I should set the thresholds in the gap between the no-match and match ranges.

Questions to Explore
-How to optimize pre-filtering to increase matching reliability
-Testing performance on truly unknown videos: was the training good enough to work in the longer term?
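The match / no-match / unknown decision can be sketched in pure Python. Note the hedges: the Pearson correlation below is only a stand-in for the corr plugin's jcorr, and the default thresholds are example values taken from the thread's training runs, not universal constants — in practice they must be retrained per source.

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation of two equal-length pixel lists (stand-in for jcorr).
    Approaches 1.0 as the frames become similar."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(va * vb)

def classify(r, no_match_below=0.735454, match_above=0.906037):
    """Three-way decision with trained thresholds: only r-values outside
    the overlap zone give a definite answer; everything else is unknown."""
    if r < no_match_below:
        return "no match"
    if r > match_above:
        return "match"
    return "unknown"

print(classify(0.95))  # -> match
print(classify(0.80))  # -> unknown
print(classify(0.50))  # -> no match
```

The point of the unknown bucket is exactly what the post describes: no forced per-frame verdict, but a run of frames eventually yields a definite in-sync or out-of-sync answer.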
-Relatively finding dropped frames amongst >2 clips
-Automatically inserting missing frames from one of the other clips (just enough to make a median work)
-Optimizing the comparison function
-Measuring statistics of the matching: longest string of unknowns, % known, low/high-motion areas, how often/how many frames get dropped

The Latest Test Script(s)
A package for you to experiment with is here: http://www.sendspace.com/file/mr0uss Code:
#line-up by jmac
#0.1
#an experimental script to eventually sync noisy videos of the same source with relatively dropped frames
#this test used improperly capped passes, with random inserted and skipped frames

#requirements
#corr by jmac
#medianblur
#masktools v2

#The script is for experimentation with frame syncing. Please disable ShowMedianCleaned first.

global rmin=1.
global rmax=-1.

#not used yet
global rminmatch=1.
global rmaxmatch=-1.
global rminnomatch=1.
global rmaxnomatch=-1.

#Adjust this to your work directory
dir="C:\project001\multicap resync\"
fn="short"

#Load a set of numbered clips
n=1
v1=avisource(dir+fn+String(n, "%02.0f")+".avi")
n=n+1
v2=avisource(dir+fn+String(n, "%02.0f")+".avi")
n=n+1
v3=avisource(dir+fn+String(n, "%02.0f")+".avi")
n=n+1
v4=avisource(dir+fn+String(n, "%02.0f")+".avi")
n=n+1
v5=avisource(dir+fn+String(n, "%02.0f")+".avi")

#Set the colorspace, or pre-filter the videos
v1=v1.common
v2=v2.common
v3=v3.common
v4=v4.common
v5=v5.common

#Label the videos
v1=v1.subtitle("Capture 1").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v2=v2.subtitle("Capture 2").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v3=v3.subtitle("Capture 3").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v4=v4.subtitle("Capture 4").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")
v5=v5.subtitle("Capture 5").ScriptClip("""subtitle("frame = " + string(current_frame),x=-1)""")

#Make copies in a global context, just to make things easier later
Global v1=v1
Global v2=v2
Global v3=v3
Global v4=v4
Global v5=v5

#Show how wonderfully the multicap technique works when the captures are lined up!
ShowMedianCleaned #You should disable this line in order to do your own frame matching experiments. This video is NOT the point of the script!
#return last

#Sync the videos. For now, just print some statistics
#To use this part, seek to frame 0, hit refresh, scroll through 1 frame at a time until frame 10, then write down rmin/rmax. These are the values for the match condition.
#Seek to frame 11, hit refresh (resets rmin/rmax), then scroll to the end, then write down rmin/rmax. These are the values for the mismatch condition.
#Now place the correct thresholds in FindThresh. Then scroll through the whole video and count the frames detected as match, no match, and unknown.
#For version 0.2, automate that process!
findthresh(v1,v2)#Find range for matching clips
rminmatch=rmin
rmaxmatch=rmax
last+findthresh(v1,v2.trim(1,0)).trim(0,9)#Find range for not matching clips
rminnomatch=rmin
rmaxnomatch=rmax
return last#+subtitle("match="+string(rminmatch)+"-"+string(rmaxmatch)+", nomatch="+string(rminnomatch)+"-"+string(rmaxnomatch),y=64)

return help(v1,v2)

return subtract(v1,v2).ScriptClip("""
r=jcorr(v1,v2)
subtitle("r = " + string(r),y=-1)
""")

return last

#sync helper
function help(clip a,clip b)
{
    #A helper function to visually compare two clips. It shows the top half of each clip, followed by their blend.
    #This arrangement was designed to show if the clips are in sync.
    stackvertical(a.crop(0,0,0,-360),b.crop(0,0,0,-360),overlay(a,b,opacity=.5,mode="blend").crop(0,0,0,-360))
}

function common(clip a)
{
    a
    PreFilter
}

function FindThresh(clip a, clip b)
{
    #Using a known synced clip, find fuzzy comparison values
    #You are meant to play through this video once (for each condition), after which the trained thresholds would be set.
    #You could make it a first pass, or prepend it to your video during testing
    stackhorizontal(a,b)#to pass two clips, or could use GRunT
    ScriptClip("""
    a=crop(0,0,-last.width/2,0)#Extract the left clip
    b=crop(last.width/2,0,0,0)#Extract the right clip
    r=jcorr(a, b)#Measure similarity between the two clips
    rmin=r<rmin?r:rmin
    rmax=r>rmax?r:rmax
    subtitle("Finding Thresholds",y=16)
    subtitle("r="+string(r)+", rmin="+string(rmin)+", rmax="+string(rmax),y=32)
    #slight problem below - r>.906036 is evaluating as true when r=.906036
    status=r<.888354?"no match":"unknown"
    status=r>.906037?"match":status
    subtitle(status,y=100)
    """)
}

function TemporalMedian(clip a, int n)
{
    #Perform a temporal median of n clips.
    #The clips must be prepared with this command: interleave(clip1, clip2, ... clipn)
    a.MedianBlurT(0,0,0,1)#From MedianBlur dll
    SelectEvery(n,1)
}

function ShowMedianCleaned
{
    #This is just to show the ideal result we are aiming for
    Interleave(v1, v2, v3, v4, v5)
    TemporalMedian(5).subtitle("Median Cleaned version with 5 captures",y=120)
    stackhorizontal(last,v1)
}

function PreFilter(clip a)
{
    #Experiment with different pre-filtering options here
    mt_luts(a, a, mode="med", pixels=mt_rectangle(1,1), expr="y")#Perform a depulse pre-filtering
}

Very good explanation of the use of median for noise reduction: http://dmr.ath.cx/gfx/median/
Description, and another rediscovery, of the multiple capture technique: http://forum.doom9.org/archive/index.php/t-150200.html
Dropped frames in VirtualDub: http://forums.virtualdub.org/index.p...ropped+frames&
Inserted frames in VirtualDub: http://forums.virtualdub.org/index.p...ropped+frames&
Recognition of the sync problem in relation to the median technique: http://forum.doom9.org/archive/index.php/t-163958.html
Find frames missing in one video but not the other: http://forum.videohelp.com/threads/3...66#post1912566
Find matching frame/clip in another clip: http://forum.doom9.org/showthread.ph...33#post1258933
Test a capture system for dupes/drops with Glitch Analyzer: http://forum.doom9.org/showthread.php?p=1462931
FillDropsI - interpolates dropped frames: http://forum.videohelp.com/threads/3...=1#post2077809
http://forum.doom9.org/showthread.php?t=160623

Last edited by jmac698; 23rd July 2012 at 16:30. Reason: Improved writing
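The "alternative measurement approach" from the first post — test each frame of clip 1 against both the same frame and the previous frame of clip 2 — can be sketched in pure Python. This is an illustration only: `similarity` stands in for any match function (such as the correlation), and the frames here are just placeholder values.

```python
def find_drops(clip_a, clip_b, similarity):
    """Flag frame indices where clip_b's previous frame matches clip_a's
    current frame better than the same-index frame does -- evidence that
    a frame went missing in clip_b. Only a +/-1 frame search is needed."""
    drops = []
    for n in range(1, min(len(clip_a), len(clip_b))):
        same = similarity(clip_a[n], clip_b[n])
        prev = similarity(clip_a[n], clip_b[n - 1])
        if prev > same:   # ties stay undecided, which is the conservative choice
            drops.append(n)
    return drops

# Toy example: frames are numbers, similarity is closeness.
sim = lambda x, y: -abs(x - y)
a = [1, 2, 3, 4, 5]
b = [1, 2, 4, 5]      # frame "3" was dropped in clip b
print(find_drops(a, b, sim))  # -> [3]
```

Consistent with the post's caveat, the toy run flags the drop one frame after it actually happened: the out-of-sync condition is detected, but not its exact starting frame.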
23rd July 2012, 00:35 | #2 | Link |
Registered User
Join Date: Jan 2006
Posts: 1,867
It's been suggested to blur the clips for detection. Some things I should point out about correlation: it's unaffected by brightness, contrast, or rearranging pixels, so it's already a pretty good comparison function.
Pulses are the addition of white lines, i.e. an addition of brightness. If there's the same amount of pulses in both clips, they won't affect the correlation; if the pulses cover up some unique pixels, they reduce the correlation. I don't have much hope for blurring either, because it just spreads the energy around without changing it. But I can try it anyway.
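The brightness/contrast claim is easy to check for Pearson-style correlation, which normalises out mean and scale. A pure-Python sketch (again a stand-in for the corr plugin, with made-up pixel values):

```python
from math import sqrt

def pearson(a, b):
    """Pearson correlation of two equal-length pixel lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / sqrt(sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b))

frame = [12, 40, 200, 90, 60, 33]
noisy = [15, 38, 195, 94, 55, 36]        # same frame plus noise
darker = [x - 10 for x in noisy]         # brightness shift
stretched = [2 * x + 5 for x in noisy]   # contrast + brightness change

r0 = pearson(frame, noisy)
print(abs(pearson(frame, darker) - r0) < 1e-9)     # -> True (r unchanged)
print(abs(pearson(frame, stretched) - r0) < 1e-9)  # -> True (r unchanged)
```

Any positive linear transform of one clip's pixel values leaves r exactly unchanged, which is why tweak-style brightness adjustments don't disturb the match measure.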
23rd July 2012, 05:31 | #3 | Link |
Registered User
Join Date: Feb 2004
Posts: 1,348
You will probably have to use a 'real' denoiser. The fastest that might work is some sort of median filter. Obviously, temporal processing should be avoided. Other than that, you really just need to try tons of stuff until you find something that works (there is always something), if you really need to improve it, that is.
23rd July 2012, 05:46 | #4 | Link
Registered User
Join Date: Jan 2006
Posts: 1,867
Step 1: How to optimize pre-filtering to increase matching reliability
Using mp4's suggestion, I added depulse pre-filtering. Code:
function common(clip a)
{
    mt_luts(a, a, mode="med", pixels=mt_rectangle(1,1), expr="y")
}

Quote:
I've also verified my thoughts on the correlation: applying tweak(bright=-10,coring=false) to one clip didn't change the correlation.

Update Code:
Results summary

Filtering type   Matches   Mismatches   MatchRun   MismatchRun
none             7         9
rect(1,1)        5         20           17         6
rect(1,3)        3         20

Last edited by jmac698; 23rd July 2012 at 13:42.
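The mt_luts call above amounts to a small spatial median. The same depulse idea in pure Python, reduced to one dimension for brevity: each pixel becomes the median of its 3-sample neighbourhood, which removes single-sample pulses. (This is an illustrative sketch, not the masktools kernel; edge pixels are simply left untouched.)

```python
from statistics import median

def depulse(line):
    """3-tap horizontal median: each interior pixel becomes the median of
    itself and its two neighbours, killing isolated pulses (white-line noise).
    Reads from the original line so earlier replacements don't cascade."""
    out = list(line)
    for i in range(1, len(line) - 1):
        out[i] = median(line[i - 1:i + 2])
    return out

scanline = [20, 22, 255, 21, 19, 255, 255, 23]
print(depulse(scanline))  # -> [20, 22, 22, 21, 21, 255, 255, 23]
```

Note the two adjacent 255s survive: a 3-tap median only removes pulses one sample wide, which is one reason to experiment with larger windows like the rect(1,3) row in the results above.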
23rd July 2012, 16:19 | #5 | Link |
Registered User
Join Date: Jan 2006
Posts: 1,867
I've posted a package to experiment with this: http://www.sendspace.com/file/mr0uss (but see the first message for updated links). It uses different clips from the ones in the results reported above. This is actually an hour away from doing something useful.
I know there are several people who will be interested in this development. If you can make any discoveries, let me know.

Last edited by jmac698; 23rd July 2012 at 16:36.