Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
15th March 2013, 04:54 | #18001 | Link | |
Registered User
Join Date: May 2009
Posts: 212
|
Quote:
Set it to "performance" + "VSync ON", and disable all other AA / texture filtering options, etc. Recent nVidia GPU drivers since 304.xx adaptively control the GPU's frequency depending on the measured load; the driver no longer fully honors the OS power management settings. This behavior is described in the driver's release notes. To be honest, I cannot tell the difference between Lanczos3_AR and Jinc3_AR on 1280x720 anime content with a 1920x1080 10-bit output signal to a Sony 65HX920 (with super-resolution 50%, SBR on). Last edited by pie1394; 15th March 2013 at 10:57. |
|
15th March 2013, 05:38 | #18002 | Link | |
Kid for Today
Join Date: Aug 2004
Posts: 3,477
|
Quote:
Anyway, you can get OLED for a grand right here right now and it natively supports 720p@24Hz too: http://www.avsforum.com/t/1426822/ |
|
15th March 2013, 14:17 | #18003 | Link | |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
It's a high-end broadcast monitor, not a consumer-grade display.
|
15th March 2013, 20:18 | #18004 | Link |
Registered User
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
|
Good to know you figured out the issue, artios; it turned out to be simpler than we thought. I have one more question for everyone, and on this one I'm really lost, since I have no idea what could cause it:
"I can't seem to get "CUVID Hardware/GPU Deinterlacing" to work. I set it to "Enable Adaptive HW Deinterlacing" and to the "50p/60p (Video)" option, and also enabled "High-Quality Processing". I'm running an Nvidia 670, so I know it is not a horsepower issue. I mostly just want to see whether playback is smoother, since it brings the video up to 60 fps. Anyone know how to get this working?" I know hardware acceleration won't work with 10-bit content, but that doesn't really seem to be the case here, so I'm also asking for directions on that one. Any help is appreciated, as always. Oh - I also searched the forum and Google before asking, so I'm fairly confident this is a "new" issue. Both of them are using the KCP pack, which has madVR (0.86.1), LAV Filters (0.55.3), and MPC-HC lite.
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack Last edited by Niyawa; 16th March 2013 at 01:32. Reason: Removed something that was already answered. |
16th March 2013, 04:48 | #18005 | Link |
Registered User
Join Date: Jan 2002
Posts: 1,264
|
Is anyone able to save images in MPC-HC with this version of madVR? I am using the latest LAV Filters too. All I get is a blank image; I have to switch to EVR to capture a still. Overlay is unticked, btw (I read back a bit).
Latest Nvidia drivers and a GTX 560 Ti. Last edited by oddball; 16th March 2013 at 05:03. |
16th March 2013, 06:04 | #18007 | Link |
Registered User
Join Date: Jul 2008
Posts: 60
|
I saw there was a discussion a few pages back about the most power-friendly setup for madVR, but what about LAV set to Intel QuickSync while using a discrete graphics card to handle the upscaling?
I ask because the Intel integrated graphics frequently causes blue screens if I turn off my TV: the dummy monitor takes over, and the system doesn't know what to do when I turn the TV back on. This has been an ongoing issue through several drivers and different Windows installs, several of them clean ones. If using DXVA is better, I may just turn off the integrated graphics altogether. Does anyone know? |
16th March 2013, 06:06 | #18008 | Link | |
Registered User
Join Date: May 2009
Posts: 212
|
Quote:
Although it is not a HW limitation, I don't think the CUDA deinterlacing function can be used independently while the CUDA video decoder is NOT in use. Just curious whether the nVidia CUDA SDK actually allows it or not... |
|
16th March 2013, 07:05 | #18009 | Link | |
Registered User
Join Date: Dec 2012
Location: Neverland, Brazil
Posts: 169
|
Quote:
__________________
madVR scaling algorithms chart - based on performance x quality | KCP - A (cute) quality-oriented codec pack |
|
16th March 2013, 12:01 | #18011 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
This makes sense for a really powerful GPU: using DXVA on a GTX 680 results in a high power state, while normal madVR scaling is not that demanding and the GPU stays at a lower power state.
The iGPU always stays very low, even under full load. |
16th March 2013, 12:53 | #18013 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
|
DXVA forces the GPU into (at least) medium power state, iirc.
Might be GPU dependent.
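For anyone who wants to check this on their own card, the driver's bundled nvidia-smi tool can report the current performance state while a video is playing. This is just a sketch; how much nvidia-smi reports on consumer GeForce cards depends on the driver version:

```shell
# Query the GPU's current performance state while playback is running.
# P0 = full power, P8 = medium, P12 = low (exact states vary per card).
nvidia-smi -q -d PERFORMANCE
```

Run it once with DXVA decoding active and once with software decoding to compare the reported P-state.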
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
16th March 2013, 14:55 | #18015 | Link |
Registered User
Join Date: Mar 2007
Posts: 934
|
You can do that using nVidia Inspector I think (if you have an nVidia card of course). My GT430 isn't powerful enough for HD playback using CUDA unless it's in the highest power state though so it was useless for me. I use DXVA2 Native now anyway.
__________________
TV Setup: LG OLED55B7V; Onkyo TX-NR515; ODroid N2+; CoreElec 9.2.7 |
16th March 2013, 15:21 | #18016 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,926
|
For AMD you can create a profile with max clocks; people use this to force the max power state, and you can use the same trick to force the medium power state.
This does not work with every card, though: my 6770 always stays at the medium power state as long as DXVA is in use. I don't know about other cards with the same "problem". |
16th March 2013, 15:27 | #18017 | Link | |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
Quote:
The only way I have been able to do that is to force the card into the low power state with Nvidia Inspector, but that can cause other problems. At least, that's what I have found when using CPU decoding and testing with very low resolution (non-demanding) videos played at 100% size. Even when GPU load is below 5%, it never drops from the medium (P8) power state into the low (P12) state. You really do need to measure these things with some kind of meter, though, as it's easy to assume something will lower power consumption when it may not. If the GPU is going to be stuck in the P8 state regardless, it might be more efficient to use DXVA than QuickSync - even if QuickSync should technically be more efficient than DXVA decoding. If CUVID forces your video card into the full power (P0) state unnecessarily, then it may actually be more efficient to use CPU decoding, for example (though I would recommend DXVA). Why is it that you don't want to use DXVA? |
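For reference, Nvidia Inspector also has a command-line switch for this, so the P-state override can be scripted around playback. The exact flag spelling below is from memory, so treat it as an assumption and verify against the tool's own help output:

```shell
# Assumed Nvidia Inspector CLI syntax: -forcepstate:<gpuIndex>,<pstateId>
# Force GPU 0 into the low-power P12 state before starting playback:
nvidiaInspector.exe -forcepstate:0,12

# Re-run with a different state id (or reboot) to undo the override.
```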
|
16th March 2013, 17:52 | #18019 | Link |
Registered User
Join Date: Aug 2004
Location: Canada
Posts: 860
|
Interesting. I tested CPU vs. GPU decoding yet again, and since I mostly play 720p upscaled, it makes little difference, though CPU decoding has a slight edge. As a result I rebalanced my upscaling settings to Jinc+Jinc instead of Jinc+Bicubic.
|
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |