Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 17th January 2013, 16:22   #16981  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by pururin View Post
In real-world scenarios, is it worth spending more CPU power to use SSRC over the default libsamplerate?

I've read some posts on Hydrogenaudio stating that good quality resamplers aren't much different as long as they do their math right.
It's probably negligible to be honest, but I have the CPU cycles to spare, and was able to find at least one test (that may have no bearing on the real world) where SSRC performed better.

As Madshi says, it's far easier to see problems with video processing than it is to hear problems with audio processing.
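For anyone curious what "doing their math right" means in practice, here is a toy numpy sketch comparing a windowed-sinc resampler against naive linear interpolation on a pure tone. It is purely illustrative - it is not SSRC's or libsamplerate's actual algorithm, and all names are made up for the example:

```python
import numpy as np

def sinc_resample(x, sr_in, sr_out, half_taps=32):
    """Toy windowed-sinc resampler (upsampling only, so no anti-alias stage)."""
    n_out = int(len(x) * sr_out / sr_in)
    pos = np.arange(n_out) * sr_in / sr_out        # fractional input positions
    y = np.zeros(n_out)
    for i, p in enumerate(pos):
        n0 = int(np.floor(p))
        n = np.arange(n0 - half_taps + 1, n0 + half_taps + 1)
        n = n[(n >= 0) & (n < len(x))]
        k = p - n
        w = 0.5 + 0.5 * np.cos(np.pi * k / half_taps)  # Hann window over the kernel
        y[i] = np.sum(x[n] * np.sinc(k) * w)
    return y

def linear_resample(x, sr_in, sr_out):
    """Naive linear interpolation at the new sample positions."""
    n_out = int(len(x) * sr_out / sr_in)
    pos = np.arange(n_out) * sr_in / sr_out
    return np.interp(pos, np.arange(len(x)), x)

sr_in, sr_out, tone = 44100, 48000, 1000.0
t_in = np.arange(int(0.05 * sr_in)) / sr_in
x = np.sin(2 * np.pi * tone * t_in)
n_out = int(len(x) * sr_out / sr_in)
ref = np.sin(2 * np.pi * tone * np.arange(n_out) / sr_out)

mid = slice(200, -200)   # ignore edge effects at both ends
err_sinc = np.max(np.abs(sinc_resample(x, sr_in, sr_out)[mid] - ref[mid]))
err_lin = np.max(np.abs(linear_resample(x, sr_in, sr_out)[mid] - ref[mid]))
```

A decent windowed-sinc kernel already beats linear interpolation by a wide margin on this test; the differences between well-implemented resamplers like SSRC and libsamplerate are far smaller than this gap, which is why they are so hard to hear.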

Quote:
Originally Posted by Dodgexander View Post
Thanks Madshi. Can someone explain what the problem is to me? I have tried to scan through the thread to understand, but have had no success.

So far I notice that the problem relates to "film mode" rather than "video mode".

So film mode presumes there's no deinterlacing?
Video mode presumes there is?

How does this work with PAL compared to NTSC?
Film mode assumes the content should be reconstructed as progressive frames (576p25 via 2:2 weaving rather than 288p50, for example), or that it requires IVTC if it is a 59.94fps source (removing the 3:2 pulldown to recover 480p24). That's not all it does, but that's the general idea.
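To make the cadence idea concrete, here is a toy sketch of 3:2 pulldown and its inversion, with frames as labels rather than pixels. It ignores field parity and the "dirty" combed frames a real encoder produces, so it only illustrates the cadence, not a real IVTC filter:

```python
def telecine_32(frames):
    """Apply 3:2 pulldown: alternate frames contribute 2 then 3 fields."""
    fields = []
    for i, f in enumerate(frames):
        fields.extend([f] * (2 if i % 2 == 0 else 3))
    return fields

def ivtc(fields):
    """Inverse telecine: collapse repeated fields back to unique frames."""
    out = []
    for f in fields:
        if not out or out[-1] != f:
            out.append(f)
    return out

fields = telecine_32(["A", "B", "C", "D"])   # 4 film frames -> 10 fields (24p -> 60i rate)
restored = ivtc(fields)                       # cadence removed: original frames back
```

The point of "film mode" is exactly this collapse: instead of treating the 10 fields as 10 moments in time, the renderer recognises the repetition pattern and rebuilds the 4 progressive frames.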
Old 17th January 2013, 16:39   #16982  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by Dodgexander View Post
Thanks Madshi. Can someone explain what the problem is to me? I have tried to scan through the thread to understand, but have had no success.

So far I notice that the problem relates to "film mode" rather than "video mode".

So film mode presumes there's no deinterlacing?
Video mode presumes there is?

How does this work with PAL compared to NTSC?
In "PAL" video streams (1080i/25 or 576i/25), there are basically only two types of video you can get: true interlaced content (1080i/25 or 576i/25), or progressive content (1080p/25 or 576p/25). With interlaced content you want to deinterlace using the best algorithm available (e.g. vector/motion adaptive) to get a progressive video suitable for display (1080p/50 or 576p/50). For progressive content, you want to simply merge pairs of fields together because they belong to a single frame with no motion (i.e. weave deinterlacing).

However, when you tell a GPU to perform deinterlacing, it should detect whether the video is actually progressive and thus weave needs to be used rather than vector/motion adaptive. Therefore, you can basically leave deinterlacing on all the time ("video mode") and it'd play back everything perfectly (25p content would be frame-doubled to 50p but this should have no impact on image quality).

Unfortunately, there are apparently some combinations of GPUs and progressive video content where this doesn't happen as intended. Therefore you need to force "film mode" for these to get proper weave deinterlacing so that quality isn't sacrificed by unnecessary vector/motion adaptive deinterlacing.

I don't believe I have any such content (or my GPUs aren't affected) so I can't test this.
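The key property behind weaving progressive content can be sketched in a few lines of numpy: splitting a progressive frame into fields and weaving them back is lossless, while weaving fields that come from two different moments in time produces combing. This is an illustration of the concept, not any GPU's actual pipeline:

```python
import numpy as np

def split_fields(frame):
    """Top field = even lines, bottom field = odd lines."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two fields back into one full-height frame."""
    out = np.empty((top.shape[0] + bottom.shape[0], top.shape[1]), top.dtype)
    out[0::2], out[1::2] = top, bottom
    return out

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(576, 720), dtype=np.uint8)

# Progressive source carried as two fields: weave reconstructs it perfectly.
top, bottom = split_fields(frame)
rebuilt = weave(top, bottom)

# True interlaced source: the two fields sample different moments in time,
# so weaving them produces a combed frame instead of the original.
moved = np.roll(frame, 8, axis=1)
combed = weave(split_fields(frame)[0], split_fields(moved)[1])
```

This is why weave is the right answer for film-type content and the wrong answer for video-type content - and why the renderer has to know (or guess) which one it is looking at.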

Quote:
Originally Posted by 6233638
Film mode assumes the content is supposed to be deinterlaced as a progressive frame (576p25 using 2:2 rather than 288p50 for example)...
Again, deinterlacing shouldn't halve vertical resolution. Obviously 576p/50 that has come from a 576i/25 source won't look as good as the 576p/50 original, but it'll look a hell of a lot better than 288p/50. This half-resolution thing is something going wrong and is not the norm for deinterlacing. For what it's worth, I have read that one or two TV models get this wrong too and fail to apply weave properly to progressive material so it's not exclusive to GPUs.
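The "288p" dispute comes down to what bob deinterlacing does to vertical detail. A minimal sketch (illustrative only): a frame whose lines alternate between black and white has maximal vertical detail; bob from a single field loses it entirely, while weave recovers it exactly:

```python
import numpy as np

def bob(field):
    """Line-double one field to full height: full frame rate, half vertical detail."""
    return np.repeat(field, 2, axis=0)

# A frame whose lines alternate 0/255: maximal vertical detail.
frame = np.zeros((576, 8), dtype=np.uint8)
frame[1::2] = 255

top = frame[0::2]      # even lines: all zeros
bottom = frame[1::2]   # odd lines: all 255

bobbed = bob(top)      # the alternating pattern is gone entirely
woven = np.empty_like(frame)
woven[0::2], woven[1::2] = top, bottom   # weave recovers the pattern exactly
```

A real motion/vector-adaptive deinterlacer sits between these two extremes, which is why its output of a 576i/25 source looks far better than 288p even though it never quite matches the 576p/50 original.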
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7
Old 17th January 2013, 17:45   #16983  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by DragonQ View Post
Again, deinterlacing shouldn't halve vertical resolution. Obviously 576p/50 that has come from a 576i/25 source won't look as good as the 576p/50 original, but it'll look a hell of a lot better than 288p/50. This half-resolution thing is something going wrong and is not the norm for deinterlacing. For what it's worth, I have read that one or two TV models get this wrong too and fail to apply weave properly to progressive material so it's not exclusive to GPUs.
What should, and what does happen, are often very different things. Proper 2:2 cadence detection seems difficult to get right, and it's not "one or two TV models" that get this wrong - the majority of televisions don't even attempt 2:2 cadence detection and just treat everything as "288p" video content.

If they do attempt it, it's often listed as an optional "film mode" setting that is off by default, because wrongly jumping into or out of film/video mode looks terrible. Even Pioneer, whom everyone seems to praise, did a terrible job with film-type PAL content.
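Why 2:2 cadence detection is so easy to get wrong can also be sketched: all the detector has to go on is a combing metric, and it must guess which field pairing ("phase") belongs together. The sketch below works on clean synthetic fields; real broadcast material with noise, cuts and mixed content is what makes the guess unreliable in practice. Purely illustrative, not any TV's actual logic:

```python
import numpy as np

def comb_energy(top, bottom):
    """Line-alternation energy after weaving two fields; low = likely same frame."""
    woven = np.empty((top.shape[0] * 2, top.shape[1]), float)
    woven[0::2], woven[1::2] = top, bottom
    return np.mean(np.abs(np.diff(woven, axis=0)))

def detect_22_phase(fields):
    """Guess the 2:2 pairing: (0,1),(2,3),... versus (1,2),(3,4),..."""
    e0 = sum(comb_energy(fields[i], fields[i + 1]) for i in range(0, len(fields) - 1, 2))
    e1 = sum(comb_energy(fields[i], fields[i + 1]) for i in range(1, len(fields) - 1, 2))
    return 0 if e0 < e1 else 1

# Progressive 25p content split into 2:2 fields: smooth frames whose brightness
# changes between frames, so only the wrong pairing combs.
frames = [np.tile(np.linspace(0, 255, 64), (64, 1)) + 40.0 * k for k in range(6)]
fields = []
for f in frames:
    fields.extend([f[0::2], f[1::2]])   # top then bottom of each frame

phase = detect_22_phase(fields)
```

On this clean material the metric is unambiguous; the trouble 6233638 describes starts when a scene cut, a caption, or interlaced inserts push the energies close together and the detector flips into the wrong mode mid-playback.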
Old 17th January 2013, 18:14   #16984  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
FWIW, I've turned film mode detection off on my JVC X3 projector, because it often switched to film mode when watching soccer, with terrible results. I guess only Gennum and VXP processors have reasonably well-working 2:2 auto film mode detection.
Old 17th January 2013, 18:28   #16985  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by madshi View Post
FWIW, I've turned film mode detection off on my JVC X3 projector, because it often switched to film mode when watching soccer, with terrible results. I guess only Gennum and VXP processors have reasonably well-working 2:2 auto film mode detection.
I seem to recall even having problems when I had a Lumagen Radiance, and I managed to convince DVDO to implement a forced film mode on one of the products I did testing for, because auto-detection rarely works satisfactorily, and few people seem to watch video-type DVDs - they're mostly films.

madVR's film mode has worked very well for me since you made some improvements to the cadence detection after the first few releases.
Old 17th January 2013, 18:39   #16986  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by 6233638 View Post
What should, and what does happen, are often very different things. Proper 2:2 cadence detection seems difficult to get right, and it's not "one or two TV models" that get this wrong - the majority of televisions don't even attempt 2:2 cadence detection and just treat everything as "288p" video content.
Nothing is treated as 288p; I don't know what you're talking about here. Do you have evidence for this claim? The fact that I only found one or two models where this problem is actually mentioned suggests it's not the norm.

Quote:
Originally Posted by 6233638 View Post
If they do, it's often listed as an optional "film mode" option that is off by default, because wrongly jumping into or out of film/video mode looks terrible. Even Pioneer, whom everyone seems to praise, did a terrible job with film-type PAL content.
Yes, leaving "film mode" on makes interlaced content look horrible - far worse than deinterlacing progressive material would look if the deinterlacer is working correctly. Again, I haven't encountered this (on my HTPC or on the few HDTVs I've used), yet you seem to be suggesting it's extremely common. Even madshi said your sample was a rare case.
Old 17th January 2013, 19:05   #16987  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by DragonQ View Post
Nothing is treated as 288p; I don't know what you're talking about here. Do you have evidence for this claim? The fact that I only found one or two models where this problem is actually mentioned suggests it's not the norm.
If you try to perform any kind of complex deinterlacing to get a 576p50 signal from a 576i video source, you will have deinterlacing artifacts.

576i neatly becomes 288p50, and while you lose half the resolution with film-type (progressive) content, most people think it looks fine.

If you allow for switching between video and film-type deinterlacing, it will invariably make the wrong guess at some point, switch to the wrong mode, and the result is a disaster that stutters, is full of combing artifacts or aliasing, or jumps between a high and low resolution image noticeably.

It's far easier to treat all interlaced content (at least 576i, perhaps not 1080i) as 288p and you never have any artefacts, other than a softer, lower resolution image. One with hard aliasing like the example I posted above is less common.

Quote:
Originally Posted by DragonQ View Post
Yes, leaving "film mode" on makes interlaced content look horrible - far worse than deinterlacing progressive material would look if the deinterlacer is working correctly. Again, I haven't encountered this (on my HTPC or on the few HDTVs I've used), yet you seem to be suggesting it's extremely common. Even madshi said your sample was a rare case.
But enabling the "film mode" option on a television doesn't mean "treat this input as progressive content"; all it does is let the display attempt 2:2 cadence detection - and I have never seen a television do a good job of that, if it even has the option. Even when they manage to display films without artifacts, they tend to misidentify video content as film-type and switch to the wrong mode, with very bad results, so you can't leave the option enabled all the time - and most people don't want to jump into the menus to toggle it on and off, so it gets left off.
Old 17th January 2013, 19:28   #16988  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by 6233638 View Post
If you try to perform any kind of complex deinterlacing to get a 576p50 signal from a 576i video source, you will have deinterlacing artifacts.

576i neatly becomes 288p50, and while you lose half the resolution with film-type (progressive) content, most people think it looks fine.
I still don't know what you mean. Performing "complex deinterlacing" doesn't get you 288p/50 video. Performing bob deinterlacing would have that effect but that is clearly not what is meant by proper "complex deinterlacing".
Old 17th January 2013, 19:33   #16989  |  Link
Jong
Registered User
 
Join Date: Mar 2007
Location: London, UK
Posts: 576
Quote:
Originally Posted by 6233638 View Post
It's far easier to treat all interlaced content (at least 576i, perhaps not 1080i) as 288p and you never have any artefacts, other than a softer, lower resolution image.
Are you really suggesting playing all PAL DVDs at half resolution?!

I have to say I've never had a problem with either MPC-HC or, now, madVR handling PAL movies. Maybe the odd TV show, but going half resolution on all movies (90+% of my viewing) would be mad!
Old 17th January 2013, 20:08   #16990  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by DragonQ View Post
I still don't know what you mean. Performing "complex deinterlacing" doesn't get you 288p/50 video. Performing bob deinterlacing would have that effect but that is clearly not what is meant by proper "complex deinterlacing".
Bob deinterlacing is half resolution.

With interlacing you only get full resolution at half framerate, or half resolution at full framerate. Never both.

"Complex deinterlacing" would be motion compensated deinterlacing, which attempts to create a 576p50 image from a 576i50 source. This type of deinterlacing always results in artifacts.

Quote:
Originally Posted by Jong View Post
are you really suggesting playing all PAL DVDs at half resolution?!

I have to say I've never had a problem with either the MPC-HC or, now, MadVR handling PAL movies. Maybe the odd TV show, but going half resolution on all movies (90+% of my viewing) would be mad!
No, I'm saying that most televisions play back 576i sources at half resolution, and don't even attempt 2:2 cadence detection, because it means you're never going to have deinterlacing artifacts, and most people don't notice.

If you have the option, it's usually called something like "film mode", which enables the TV to attempt 2:2 cadence detection. This means films may be played back at full resolution, but it is likely to drop out of cadence on occasion, and to attempt 2:2 when watching video-type content, which results in ugly artifacts - which is why, if the set has the option at all, it is disabled by default.


With madVR you have the luxury of being able to force film mode, which works the vast majority of the time.

Last edited by 6233638; 17th January 2013 at 20:13.
Old 17th January 2013, 21:22   #16991  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by 6233638 View Post
Bob deinterlacing is half resolution.

With interlacing you only get full resolution at half framerate, or half resolution at full framerate. Never both.
Correct in terms of the pure information that's included in the signal.

Quote:
Originally Posted by 6233638 View Post
"Complex deinterlacing" would be motion compensated deinterlacing, which attempts to create a 576p50 image from a 576i50 source. This type of deinterlacing always results in artifacts.
Yes, you get some artefacts from the interpolation, but it still looks mightily better than bob deinterlacing. I'd never consider deinterlaced content to be "half resolution" - modern deinterlacers are much better than that.

Quote:
Originally Posted by 6233638 View Post
No, I'm saying that most televisions play back 576i sources at half resolution, and don't even attempt 2:2 cadence detection, because it means you're never going to have deinterlacing artifacts, and most people don't notice.
BS. There is no way that "most" televisions don't go beyond bob deinterlacing, that is easy to notice and I have rarely seen it. On the rare occasions I have seen it, it was usually in the source material.
Old 17th January 2013, 22:47   #16992  |  Link
ajp_anton
Registered User
 
 
Join Date: Aug 2006
Location: Stockholm/Helsinki
Posts: 805
Quote:
Originally Posted by madshi View Post
Telecined PAL content is 50i. It absolutely exists. E.g. I have samples of a PAL DVD where each encoded frame contains the bottom field of the previous frame and the top field of the next frame. Without proper IVTC you'd either get combing or 50p output instead of 25p.
Out of curiosity, why does this even exist?
Old 17th January 2013, 22:54   #16993  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
I agree with 6233638 that the automatic film vs video detection for PAL content doesn't work that well in many of today's devices. At least that's my own experience.

However, I agree with DragonQ that "proper" video deinterlacing is much better than simple Bob. I don't know of any consumer hardware solution which uses motion compensation for video mode deinterlacing (not sure about what ATI and NVidia do, though, maybe they do some sort of motion compensation, I don't know). The deinterlacer hardware chips in TVs, Receivers, Blu-Ray players etc all use motion-adaptive deinterlacing with some sort of diagonal filtering. The key feature of motion-adaptive deinterlacing is that it detects static vs. moving image areas. For static image areas you can safely weave the fields together without getting artifacts and as a result for static image areas you get full progressive resolution. For moving parts most implementations use a Bob scaling algorithm with diagonal filtering, which produces results which are better than a Bob algorithm based on simple Bilinear scaling.

I would guess that if you used something like Jinc on every separate field (like 6233638 seems to suggest), you'd get better image quality for *moving* parts than what the typical CE deinterlacer chip does. However, if you don't weave static image areas then you're going to lose a lot of resolution during "quiet" scenes.
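The motion-adaptive scheme described above can be sketched per pixel: compare same-parity fields to build a motion mask, weave where the image is static, and fall back to a crude bob where it moved. Real chips add diagonal filtering and far better motion masks; this is only the skeleton of the idea:

```python
import numpy as np

def motion_adaptive(prev_top, top, bottom, thresh=8.0):
    """Weave static areas; fall back to a crude line-copy bob where pixels moved."""
    out = np.empty((top.shape[0] * 2, top.shape[1]), dtype=float)
    out[0::2] = top                               # current field is always kept as-is
    moving = np.abs(top - prev_top) > thresh      # per-pixel mask from same-parity fields
    # Static pixels: weave in the other field (full vertical resolution).
    # Moving pixels: repeat the line from the current field (crude bob).
    out[1::2] = np.where(moving, top, bottom)
    return out

frame = (np.arange(576 * 720, dtype=float).reshape(576, 720)) % 251.0
top, bottom = frame[0::2], frame[1::2]

static_out = motion_adaptive(top, top, bottom)          # no motion: exact reconstruction
moving_out = motion_adaptive(top + 100.0, top, bottom)  # motion everywhere: bob fallback
```

This makes the trade-off visible: the woven (static) path is lossless, so quiet scenes retain full resolution, while everything hinges on how good the motion mask is.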

Quote:
Originally Posted by ajp_anton View Post
Out of curiosity, why does this even exist?
Because some studios/encoding houses don't know what they're doing.
Old 17th January 2013, 22:57   #16994  |  Link
dukey
Registered User
 
Join Date: Dec 2005
Posts: 560
It exists because someone fucked up, basically. It is and it isn't fixable: you can shift the fields back into the frames they should be in, but often you'll be left with residual interlacing, so really you should deinterlace it after field shifting.

IVTC is only for NTSC material; it has no meaning in the PAL world.
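The field-shifting repair can be sketched with labelled fields. The offsets below assume the pattern madshi quoted (each encoded frame carrying the top field of the next source frame and the bottom field of the previous one); the exact offsets vary per disc, so treat this purely as an illustration:

```python
def scramble(sources):
    """Mis-mastered stream: enc[i] = (top of src[i+1], bottom of src[i-1])."""
    n = len(sources)
    return [(sources[(i + 1) % n][0], sources[(i - 1) % n][1]) for i in range(n)]

def field_shift_fix(encoded):
    """Undo the shift by re-pairing: src[i] = (top from enc[i-1], bottom from enc[i+1])."""
    n = len(encoded)
    return [(encoded[(i - 1) % n][0], encoded[(i + 1) % n][1]) for i in range(n)]

# Each source frame is a (top, bottom) pair of field labels.
sources = [(f"T{i}", f"B{i}") for i in range(6)]
restored = field_shift_fix(scramble(sources))
```

When the offset guess is right, every field lands back in its original frame; when it is wrong (or the pattern changes mid-stream), you get the residual interlacing dukey mentions, which is why a deinterlacing pass after field shifting is still advisable.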
Old 17th January 2013, 22:58   #16995  |  Link
Pomegranate
Registered User
 
 
Join Date: Oct 2012
Posts: 66
Quote:
Originally Posted by ajp_anton View Post
Out of curiosity, why does this even exist?
24p -> 50i , no audio speed-up required.
Old 17th January 2013, 23:07   #16996  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by dukey View Post
IVTC is only for NTSC material, it has no meaning in the PAL world.
I totally disagree. Both Wikipedia and NVidia use the term "Telecine" for PAL content, too:

http://en.wikipedia.org/wiki/Telecine
http://www.nvidia.com/object/IO_26271.html

IVTC just means "Inverse Telecine", which is the proper term for finding the correct fields and weaving them back together. Which is what madVR does for PAL content, too, when you enable film mode.
Old 17th January 2013, 23:36   #16997  |  Link
DragonQ
Registered User
 
Join Date: Mar 2007
Posts: 934
Quote:
Originally Posted by madshi View Post
(not sure about what ATI and NVidia do, though, maybe they do some sort of motion compensation, I don't know)
They all do better than this.

Quote:
Originally Posted by madshi View Post
The deinterlacer hardware chips in TVs, Receivers, Blu-Ray players etc all use motion-adaptive deinterlacing with some sort of diagonal filtering. The key feature of motion-adaptive deinterlacing is that it detects static vs. moving image areas. For static image areas you can safely weave the fields together without getting artifacts and as a result for static image areas you get full progressive resolution. For moving parts most implementations use a Bob scaling algorithm with diagonal filtering, which produces results which are better than a Bob algorithm based on simple Bilinear scaling.
That sounds about right for motion adaptive deinterlacing but modern GPUs use vector adaptive deinterlacing, which is far better at reproducing the "original" full resolution progressive video than motion adaptive deinterlacing. Both the GTS250 and GT430 do some form of vector adaptive deinterlacing.

These examples are from the Cheese Slice test:

Old 18th January 2013, 01:11   #16998  |  Link
xabregas
Registered User
 
Join Date: Jun 2011
Posts: 121
Quote:
Originally Posted by madshi View Post


That's hard to say. Personally, I'd take the NVidia for various reasons. But maybe next week I'll have a different opinion, I don't know...

I got the ATI; I read it too late.

And I'm paying for that already. These GPU drivers are driving me crazy: madVR is not smooth at all. I disabled everything I found in the CCC video section, and playback is still sometimes smooth and sometimes not...

Maybe it's PowerPlay changing the clocks for power saving, or some other power-saving feature. How can I disable that waste of technology in CCC? I think the problem is CCC: I can see the clocks go to 1000MHz, then drop to 300 and back up, and that's making madVR's exclusive mode unstable.

Anyway, it could also be the 3D settings, because madVR's exclusive mode uses 3D, and CCC has some amazing 3D settings that simply can't be disabled. For some of them I can choose "let the 3D application decide", but there are two 3D options without that choice, and there is no option to disable them.

I disabled the scaling in CCC, and IVT too (I don't know what it does, so better to leave it off).

I only have pulldown detection set to auto, forcing smooth playback, and that's all...

Do I still need to disable the PowerPlay garbage?

Anyway, I reset madVR and only changed the upscaling methods to Lanczos 4 taps (AR enabled). Those are my settings... plus LAV Filters + MPC-HC, and ReClock for WASAPI only.

Ah, I have Aero disabled. With Aero enabled I only get errors, f&$/&$/ Aero.

Let's see...

It's a 7770, it should handle madVR without problems... I'm scaling 720p videos to a 1360x768 resolution, so it's not that big an upscale, I think.

I did change the buffer in the exclusive mode settings several times, to 10 and 8; maybe it was that. I have an AMD Phenom X4 965 BE overclocked to 4GHz.

TIA
Old 18th January 2013, 01:34   #16999  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
xabregas, what driver are you running with your HD7770? Also, which decoder, player and what OS?
Old 18th January 2013, 01:39   #17000  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by DragonQ View Post
Obviously 576p/50 that has come from a 576i/25 source won't look as good as the 576p/50 original, but it'll look a hell of a lot better than 288p/50. This half-resolution thing is something going wrong and is not the norm for deinterlacing.
It's only recently that I've gained an interest in interlaced material, but CUVID does a hell of a job deinterlacing 29.97fps video material to 59.94fps. Quite frankly, it looks fantastic to me. James Cameron keeps saying that frame rate is more important than resolution (trying to push for 1080/48p instead of 4K/24p), and it sure doesn't look 288p to me.

Now how much resolution do we lose when watching an NTSC (video mode) DVD deinterlaced to 59.94fps? GPUs use fairly advanced algorithms, IIRC.