2nd May 2003, 04:29   #21
Guest
Here's an alpha of MultiDecimate()

Attached herein for your decimating pleasure please find MultiDecimate() 1.0.0 beta 1. Please read the text help file included in the distribution for limitations and future plans. This is a first release to prove the concept and get user feedback.

[EDIT: Alpha withdrawn as I may have found a bug. Please stay tuned for a fixed release.]

3rd May 2003, 15:41   #22
Guest
not a bug, but worse

Well, there was no bug; the program was operating as designed.

The problem was that the random access algorithm was not fully correct. I've thought more and concluded that there is no correct random access algorithm that doesn't require as much scanning as a 2-pass solution.

Therefore, I have implemented the 2-pass solution and tested it locally. Note that the second pass is very fast, as there is no calculation or frame processing involved. 'manono' will be doing some additional testing and then I'll release it here. It seems to work well on his funny-money silent films that have weird duplicate patterns, like 43 out of every 143 frames!

The use of two passes allows for the insertion of a (GUI) application that permits manual tweaking of the decimation decisions, if desired. It also allows for totally random decimation, such as "remove all combed frames in the clip" (because the correct frame count is known before the second pass begins).
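To make the two-pass mechanics concrete, here's a rough sketch of the idea in Python (purely illustrative; this is not the plugin's code, the names are made up, and a frame is treated as a flat list of luma samples):

# Sketch of a two-pass decimator (NOT MultiDecimate's actual implementation).
# Pass 1 measures how different each frame is from its predecessor;
# an offline step then decides which frames to drop;
# pass 2 simply maps requested output frame numbers to the kept source frames.

def pass1_metrics(frames):
    """Return a difference metric for every frame (frame 0 gets 0.0)."""
    metrics = [0.0]
    for prev, cur in zip(frames, frames[1:]):
        # mean absolute luma difference as a stand-in metric
        metrics.append(sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur))
    return metrics

def pass2_mapping(total_frames, drop_list):
    """Map requested output frame numbers to source frame numbers."""
    dropped = set(drop_list)
    return [n for n in range(total_frames) if n not in dropped]

# Because the drop list is fixed before pass 2 starts, the output frame count
# (and therefore the frame rate) is known exactly up front -- which is what
# makes things like "remove all combed frames" possible.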

4th May 2003, 21:01   #23
Guest
Here's a test version

OK, here is a test version. It does n-out-of-m decimation with a maximum cycle size of 250. 'manono' reports that it is working well on his silent movies.

http://shelob.mordor.net/dgraft/misc...ecimate102.zip

Instructions and limitations are in the help file, of course.

Later versions will add YV12 and additional decimation modes.
4th May 2003, 21:01   #24
Dali Lama
Re: not a bug, but worse

Quote:
Originally posted by neuron2
It also allows for totally random decimation, such as "remove all combed frames in the clip" (because the correct frame count is known before the second pass begins).
Very interesting. I don't really understand how this would work. Can you elaborate, neuron2? I think this may mean that clips which have two nearly identical frames with just slight combing can now be handled better (such as mouth combing in anime)?

Perhaps I am wrong?

Dali
4th May 2003, 22:55   #25
Guest
@Dali Lama

The first pass is used to gather metrics about the frames. Then the GUI application is used to translate from the metrics to the decimation list used by the second pass (i.e., a list relating requested frame numbers to the frame numbers to be returned). Currently, the metrics collected are just frame differences, and the GUI selects the most similar n out of m frames to decimate.

The metrics gathered could however be other things, such as amount of combing. That doesn't mean that the combing detection will be better than what we have already, however! But the two-pass mechanism allows for decimating all the frames detected as combed, which cannot be supported by the existing Decimate().

The GUI application can also be enhanced to allow totally random editing of the frame remove list, thereby allowing completely random decimation.

The basic idea is simply to allow decimation that doesn't fit the 1-in-n mold.
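The cycle-based selection the GUI currently does boils down to something like this (a conceptual Python sketch with made-up names, not the GUI's actual code):

def select_drops(metrics, cycle, remove):
    """For each cycle of `cycle` frames, mark the `remove` most similar
    frames (smallest difference metric) for removal."""
    drop_list = []
    for start in range(0, len(metrics), cycle):
        frames = list(range(start, min(start + cycle, len(metrics))))
        frames.sort(key=lambda n: metrics[n])   # most duplicate-like first
        drop_list.extend(frames[:remove])
    return sorted(drop_list)

# e.g. manono's odd silent-film pattern: remove 43 out of every 143 frames
# drops = select_drops(metrics, cycle=143, remove=43)

Swap in a different rule (for example, "every frame detected as combed", or hand edits made in the GUI) and you get the non-1-in-n decimation described above.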
6th May 2003, 09:24   #26
MrBunny
@neuron2

This might be a bit off topic, but since the topic of silents came up, it seems somewhat appropriate.

In many of the silent movies I've looked at, the structure is a sequence of action, then a few seconds of written dialog, and then back to the action. What I've found is that during a cycle spanning the change from dialog back to action, Decimate will sometimes take a frame from the dialog instead of a duplicate from the action section. In the cases I've seen, the action duplicates aren't exact duplicates (although you can tell they are the same frame).

I'm not exactly sure how Decimate works internally, but I was wondering if there is a possibility of adding a new mode to it that would do the following (sketched below):
1. It checks for all frames that are deemed "duplicate" (according to a certain threshold) in the current cycle.
2. It then checks each of these frames to see if it is part of a sequence of duplicate frames (guided by a second metric like threshold2, with a sequence length given by the user) and exempts those frames from being chosen to be dropped.
3. If other duplicate frames remain, it chooses the one to drop as it normally would. If none are left, it drops one of the exempt frames.
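Something like this, conceptually (a quick Python sketch of the idea only; the names and thresholds are made up, and I obviously don't know how Decimate actually does it):

def duplicate_runs(metrics, threshold2):
    """For each frame, the length of the run of consecutive near-duplicates it
    belongs to (metric below threshold2 means 'same as the previous frame')."""
    run_len = [1] * len(metrics)
    start = 0
    for n in range(1, len(metrics) + 1):
        if n == len(metrics) or metrics[n] >= threshold2:
            for i in range(start, n):
                run_len[i] = n - start
            start = n
    return run_len

def choose_drop(metrics, run_len, cycle_frames, threshold, min_run):
    """Drop the most duplicate-like frame in the cycle, preferring dups that
    are NOT inside a long static run (e.g. a dialog card)."""
    dups = [n for n in cycle_frames if metrics[n] < threshold]
    preferred = [n for n in dups if run_len[n] < min_run]
    candidates = preferred or dups or list(cycle_frames)
    return min(candidates, key=lambda n: metrics[n])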

After reading the description of the MultiDecimate algorithm, this still seems to be a potential problem when encoding silents, or any movie with a series of still frames. In fact, I think it could be worse with MultiDecimate, as I'm not sure how the multiple frames are chosen; it seems possible that all the dropped frames could come from the dialog instead of the actual action sequences. Unfortunately I don't have any such material on hand at the moment, so this is mostly theoretical (apart from what I've observed before).

Thanks for your time as always,

Mr. B
6th May 2003, 16:41   #27
Guest
@MrBunny

Thank you for the report. I am well aware of this issue (it is mentioned in the MultiDecimate help file!) and am working to develop a solution.

6th May 2003, 20:50   #28
Guest
We don't want to just exempt the static scenes, because to do so may result in too much decimation of action frames. Probably we want to specify a certain run size for duplicates, such that runs of this size or greater are decimated at the defined ratio, and no more.
7th May 2003, 02:13   #29
Guest
Here's my plan for the static scene problem. Tell me if you think it is not OK.

You will select a mode called "cycle-based: protect static scenes". The existing mode will be "cycle-based: naive". A "global: remove all duplicates" mode will be added later.

To protect static scenes you'll configure the duplicate threshold, which is the metric below which dups are declared, and the dup run length, at or above which protection is applied. Protection will consist of calculating how many frames may be removed from the run based on the remove/cycle ratio, and then setting the metric on the other frames in the run very high, so that they don't get decimated early. I'll use a trick that will cause multiple runs to be decimated equally.
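In rough Python pseudocode (illustrative only; the names are made up and this is not the actual implementation), the protection step would look something like this:

def protect_static_runs(metrics, runs, cycle, remove, run_threshold):
    """For each run of dups of length >= run_threshold, allow only the number
    of removals the remove/cycle ratio implies; push the metric of the other
    frames in the run very high so they are not decimated early."""
    HUGE = 1.0e9
    protected = list(metrics)
    for start, length in runs:                 # runs found with the dup threshold
        if length < run_threshold:
            continue
        allowed = (length * remove) // cycle   # removals this run should contribute
        by_similarity = sorted(range(start, start + length), key=lambda n: metrics[n])
        for n in by_similarity[allowed:]:      # everything beyond the quota...
            protected[n] = HUGE                # ...is protected from early decimation
    return protected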

Later I will add preference for doubletons and then singletons, as that is necessary for clips like Princess Mononoke.

I will begin coding this tomorrow unless I hear objections by then.
7th May 2003, 07:40   #30
MrBunny
@neuron2

It seems I must have missed that in the readme.

The layout you suggested seems about right. The only question I have is this: how can the number of post-decimation frames be calculated without considering the other frames within a cycle (the edge condition)? For example, given a sequence of 11 duplicate frames, after a Decimate(cycle=5) you can have anywhere from 8 to 10 duplicates left, depending on the types of frames surrounding the dupe sequence.
To illustrate, given a sequence of duplicate frames (D), unique frames (U) and field-matched duplicate frames (T), and using | as cycle boundaries:

Situation 1 (10 dupes left):
TTDDD|DDDDD|DDDTT -> correctly decimated it should be TDDD|DDDD|DDDT

Situation 2 (9 dupes left):
UUDDD|DDDDD|DDDTT -> correctly decimated it should be UUDD|DDDD|DDDT

Situation 3 (8 dupes left):
UUDDD|DDDDD|DDDUU -> correctly decimated it should be UUDD|DDDD|DDUU

Situation 4 (Example with dupes starting cycle, 9 dupes left)
UUTTU|DDDDD|DDDDD|DUTTU -> correctly decimated it should be UUTU|DDDD|DDDD|DUTU

I hope it's reasonably clear what I think the problem could be with processing just the duplicate frames first and not considering the frames that aren't duplicates but share a cycle with part of that sequence. It was for this reason that I'd originally thought to run Decimate as normal, then check whether the chosen frame was part of a sequence, and only use it if there wasn't another pair of frames that could be considered duplicates and not within a sequence.
7th May 2003, 08:20   #31
Guest
@MrBunny

We're talking a bit at cross purposes here. I am addressing MultiDecimate(), where the large cycles can cause a problem for static sequences. I wasn't addressing normal 1-in-5 cycles.

However, you do raise valid issues. The big problem is that we can't tell the difference between duplicates resulting from the 3:2 pulldown and those from a static scene. Besides, this never really arose as a practical problem when doing 3:2 decimation. There was a practical problem with animation that originated at 12fps, and that was solved with mode=2. But unless you produce a clip that really shows a practical problem based on your theoretical observations, I'm not going to get very excited about it, especially as doing anything about it will be extremely difficult.
7th May 2003, 19:00   #32
Guest
static scene protection implemented

MultiDecimate 1.0.3 supports static scene protection. Get it here:

http://shelob.mordor.net/dgraft/misc...ecimate103.zip
14th May 2003, 01:16   #33
thalos
@neuron2: thanks bunches for the filter... it's the first solution I've found for getting 120fps encodes back to NTSC standards without creating too much jerkiness... I'd been trying several ways and accidentally stumbled upon it. My problem was that the source included runs of both 3 and 4 dupes, one after the other. The result is still a little jerky... but I guess I'll live. Though on another note... I think I might have broken something when I tried the following:

Cycle=20 Remove=15 and Threshold=1.0 Run=15

Not sure what caused what... but the dfile began to grow rather large... it was about 4 GB when I killed it. If you want, I can post the mfile I used too...
__________________
- Thalos
When all you really feel like doing is putting on a funny hat.
14th May 2003, 02:26   #34
Guest
Hi,

About that dfile growing. Get 1.0.5 from my web site. Sorry.

Any feedback you have or feature requests will be gratefully received.

If you can make available a clip that comes out jerky, I can probably fix it.

Thank you.
14th May 2003, 15:21   #35
thalos
thanks for the quick fix...

Here's the short clip of one of the worst problem scenes...
http://www.cudelts.org/misc/Gravion%...#91;Divx5].avi

Here's the full Opening sequence...
http://www.cudelts.org/misc/Gravion%...#91;Divx5].avi

I ended up using cycle=4 remove=3... which produced a reasonably non-shaky clip.

I think the problem is pretty much unique to 120fps encodes that combine both 24fps and 29fps material into a single source. Right now the sequence does just fine decimating the 29fps scenes... but the 24fps scenes create AABCDEEFGH sequences... which cause small hiccups... The problem is really only noticeable during panning sequences... so... if you think you could make this even smoother... let me know how you do it.
__________________
- Thalos
When all you really feel like doing is putting on a funny hat.
14th May 2003, 19:00   #36
Guest
Quote:
Originally posted by thalos
I think the problem pretty much is rather unique one to encodes at 120fps that combine both 24fps and 29fps encodes into a single source.
Isn't that the whole reason for 120fps in the first place? Or is it also used for non-hybrid 24/30fps material?

Quote:
Right now the sequence does just fine decimating the 29fps scenes... and the 24fps scenes creates AABCDEEFGH sequences... which creates small hiccups... The problem is really only that noticable during panning sequences... so... if you think you could make this even smoother... lemme know how you do it.
The old hybrid clip problem. You could try this: Create your MultiDecimate'd sequence as above (you'll have AABCDEEFGH sequences). Then run that through Decimate(mode=1) with an appropriate threshold. That will convert C D E E F G to C D E E/F F G, where E/F is a blend. If you think this improves panning smoothness, we could contemplate adding similar functionality to MultiDecimate().
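To illustrate the kind of blend being described, here is a toy Python/numpy sketch of the effect (frames assumed to be uint8 numpy arrays; this mirrors the description above, not Decimate's actual code):

import numpy as np

def blend_second_duplicate(frames, threshold):
    """Where two adjacent frames are near-duplicates (E E), replace the second
    copy with a 50/50 blend of it and the next frame (E E/F), which can hide
    the hiccup left by a dropped frame during pans."""
    out = [frames[0]]
    for i in range(1, len(frames)):
        cur = frames[i]
        diff = np.mean(np.abs(cur.astype(np.int16) - frames[i - 1].astype(np.int16)))
        if diff < threshold and i + 1 < len(frames):
            cur = ((cur.astype(np.uint16) + frames[i + 1].astype(np.uint16)) // 2).astype(np.uint8)
        out.append(cur)
    return out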
16th May 2003, 02:23   #37
Guest
Version 1.0.6 available at my web site

I have released version 1.0.6 at my web site. It adds global (non-cycle-based) decimation modes, for example, "remove all duplicates but protect static scenes".

Feedback will be appreciated.
16th May 2003, 03:19   #38
Guest
oops, get 1.0.7

Please get version 1.0.7. Version 1.0.6 is setting incorrect frame rates for global decimation modes.
12th June 2003, 14:07   #39
IanB
Decimate on noisy source material

Donald,

Do you intend to include any of the new Telecide() v5 advances into Decimate()?

I have an interest in processing certain 42-minute, 80 MB DivX files from crudely deinterlaced 30 fps into smooth-flowing 25 fps suitable for watching on a PAL TV. I generally use a script of the form:

AviSource("divx.avi", True, "YUY2")
Decimate(mode=0, cycle=5)
# resize, levels, etc
AssumeFPS(25, 1, True)
ResampleAudio(44100)

The original program material is true 24 fps film that has been 3:2 telecined for broadcast, either single-field captured or crudely deinterlaced, then heavily compressed. I end up with an AVI file that has a strong 5-frame cycle (4 frames, then a "sort of" duplicate).

In scenes with strong motion Decimate() works very well; however, when the motion is less strong, Decimate() tends to make poor choices for the duplicate frame to remove. This causes noticeable motion jitter. I can fix it with an override file, but that means watching to find the jittery bits, fixing, then watching it for real. Given the material is "watch once", it's not worth the extra effort.

Running Decimate(mode=2, cycle=5, show=true) [because the output is in a convenient form] and examining the metrics, it is fairly obvious why this happens. The delta values due to noise are about the same as the normal motion (or lack of motion) deltas. Typical delta stats of a jittery stream versus a smooth stream are:

in frm 121, use frm 152
150: 1.18
151: 1.14
152: 1.09 <- True duplicate
153: 1.11
154: 1.06 <- decimate mode=0 wrong choice

in frm 145, use frm 182
180: 1.80
181: 2.06
182: 1.05 <- strong difference
183: 2.06
184: 2.13

The problem as I see it is that Decimate() has no concept of "these deltas are too close to make a sane choice, so use the previous pattern to choose"; it needs a sort of inertia to ride through the quiet parts.

Any chance of a new mode with such inertia?
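Something like this is what I have in mind, conceptually (a Python sketch with made-up names and threshold, just to show the kind of inertia I mean):

def choose_with_inertia(cycle_deltas, prev_offset, ambiguity_band=0.15):
    """Pick the offset of the frame to drop within a cycle. If the smallest
    two deltas are too close to call, reuse the offset chosen in the previous
    cycle instead of flip-flopping on noise."""
    ranked = sorted(range(len(cycle_deltas)), key=lambda i: cycle_deltas[i])
    best, runner_up = ranked[0], ranked[1]
    if cycle_deltas[runner_up] - cycle_deltas[best] < ambiguity_band and prev_offset is not None:
        return prev_offset     # too close to call: ride through on inertia
    return best                # clear winner: update the pattern

# With the jittery stream's numbers above (1.18 1.14 1.09 1.11 1.06) and a
# previous pattern of offset 2, this keeps choosing offset 2 (the true dup)
# instead of jumping to the noisy 1.06.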

IanB

P.S. Any chance of printing all the metrics in mode=0 like you do in mode=2?
14th June 2003, 00:25   #40
Guest
@IanB

My plate is so full handling decent video properly that it is doubtful I'll ever find time to address things like repairing videos that have been misprocessed. I hope you can appreciate my time limitations.