18th January 2013, 02:28   #17003
Dodgexander
Registered User
Join Date: Jul 2008
Posts: 157
Quote:
Originally Posted by DragonQ
In "PAL" video streams (1080i/25 or 576i/25), there's basically only two types of video you can get: true interlaced content (1080i/25 or 576i/25), or progressive content (1080p/25 or 576p/25). With interlaced content you want to deinterlace using the best algorithm available (e.g. vector/motion adaptive) to get a progressive video suitable for display (1080p/50 or 576p/50). For progressive content, you want to simply merge pairs of fields together because they belong to a single frame with no motion (i.e. weave deinterlacing).

However, when you tell a GPU to perform deinterlacing, it should detect whether the video is actually progressive and thus weave needs to be used rather than vector/motion adaptive. Therefore, you can basically leave deinterlacing on all the time ("video mode") and it'd play back everything perfectly (25p content would be frame-doubled to 50p but this should have no impact on image quality).

Unfortunately, there are apparently some combinations of GPUs and progressive video content where this doesn't happen as intended. Therefore you need to force "film mode" for these to get proper weave deinterlacing so that quality isn't sacrificed by unnecessary vector/motion adaptive deinterlacing.

I don't believe I have any such content (or my GPUs aren't affected) so I can't test this.
Okay, that's clear, but how is that related to my problem?
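
Just to confirm I follow the distinction, here is a rough sketch of the two paths you describe. This is purely illustrative code of my own, not anything from madVR, DXVA or the driver; the field layout and the comb threshold are assumptions.

Code:

#include <algorithm>
#include <cstdint>
#include <cstdlib>
#include <vector>

// Two fields of one interlaced pair: top = even lines, bottom = odd lines (luma only).
struct FieldPair {
    int width = 0, height = 0;            // full-frame dimensions
    std::vector<uint8_t> top, bottom;     // each holds height/2 rows
};

// "Weave": interleave the two fields back into one progressive frame.
// Lossless when both fields were sampled at the same instant (25p content).
std::vector<uint8_t> Weave(const FieldPair& f) {
    std::vector<uint8_t> frame(static_cast<size_t>(f.width) * f.height);
    for (int y = 0; y < f.height; ++y) {
        const std::vector<uint8_t>& src = (y % 2 == 0) ? f.top : f.bottom;
        std::copy_n(&src[static_cast<size_t>(y / 2) * f.width], f.width,
                    &frame[static_cast<size_t>(y) * f.width]);
    }
    return frame;
}

// Crude "do these fields really belong to one frame?" test: sum the combing
// between adjacent lines of the woven frame. True 25i content with motion
// combs heavily; 25p carried in a 25i stream does not. The threshold is an
// arbitrary assumption.
bool LooksProgressive(const std::vector<uint8_t>& frame, int width, int height,
                      long long threshold = 2000000) {
    long long comb = 0;
    for (int y = 1; y + 1 < height; ++y)
        for (int x = 0; x < width; ++x) {
            int above = frame[static_cast<size_t>(y - 1) * width + x];
            int cur   = frame[static_cast<size_t>(y) * width + x];
            int below = frame[static_cast<size_t>(y + 1) * width + x];
            comb += std::abs(cur - (above + below) / 2);
        }
    return comb < threshold;
}

Weave costs nothing and is lossless when both fields come from one frame; it is the unnecessary vector/motion adaptive path applied to progressive material that sacrifices quality, as you say.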

If a channel switches from 1080/25p to 1080/25i while DXVA deinterlacing is in use, shouldn't it detect that it needs to go from no deinterlacing (or "weave", as you describe, for the progressive source) to "vector" for the interlaced source?

Because in my case it doesn't. It always treats the video as progressive: even when I change the deinterlacing in the AMD control panel to vector or motion adaptive, the 1080/25i video gets no deinterlacing at all, or at best "weave" deinterlacing.

So since this only happens with madVR, is this a communication problem between the renderer and the AMD driver?

Why doesn't manually setting the deinterlacing to vector work when this occurs?

Why does forcing deinterlacing on in madVR work?

If it can detect the change, shouldn't it be sending the information to change the deinterlacing type to the driver?
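
Put differently, what I would have expected to happen on every frame is roughly this. It is just my mental model sketched as code; the struct, the flag names and ChooseMode() are invented for illustration and are not real madVR, DXVA or driver API.

Code:

#include <cstdio>

// Hypothetical per-frame information the renderer could look at; the field
// names are made up for this sketch, they are not real DXVA/driver fields.
struct FrameInfo {
    bool interlaced_flag;   // what the broadcast/decoder signals for this frame
    bool looks_combed;      // result of a comb check like the one sketched above
};

enum class DeintMode { None, Weave, MotionAdaptive };

// Re-evaluate on every frame instead of deciding once per stream, so a
// channel switching between 1080/25p and 1080/25i gets the right treatment.
DeintMode ChooseMode(const FrameInfo& f) {
    if (!f.interlaced_flag) return DeintMode::None;        // plain 25p
    return f.looks_combed ? DeintMode::MotionAdaptive      // true 25i
                          : DeintMode::Weave;              // 25p carried in a 25i stream
}

int main() {
    // Simulated channel behaviour: progressive film, then an interlaced studio feed.
    const FrameInfo frames[] = { {false, false}, {false, false},
                                 {true,  true }, {true,  true } };
    for (const FrameInfo& f : frames)
        std::printf("deint mode = %d\n", static_cast<int>(ChooseMode(f)));
    return 0;
}

From what I can see, that per-frame decision is not being made here, which is why I am wondering whether the information simply never reaches the driver when madVR is the renderer.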