Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
4th April 2016, 10:47 | #37361 | Link | |
Registered User
Join Date: Mar 2009
Posts: 3,650
|
4th April 2016, 11:17 | #37362 | Link |
Registered User
Join Date: Nov 2012
Posts: 6
|
Hello everyone.
My 60" 1080p plasma died on me a few weeks ago, so I bought a new 60" UHD 4K screen a few days ago, and now I have trouble watching almost everything: frame drops, stutter. To explain the situation: my receiver, an Onkyo TX-NR525, can only do 4K@30Hz, and my HTPC video card is a GTX 670, so I just want to play my videos smoothly. Is that still possible? I was using the Shark007 codecs + MPC-BE + madVR x86 + ReClock. I'll uninstall everything and use the 64-bit MPC-BE + madVR now, without ReClock. I hope it works. Any advice would be really appreciated. Thanks in advance. Kind regards, Queiroz |
4th April 2016, 12:23 | #37363 | Link |
Guest
Posts: n/a
|
I enlighten a lot of people about madVR, and many of them ask what HTPC they should build just for film playback. I usually advise an Intel i5 + GeForce GTX 960 due to its overall good performance and its H.265 hardware decoding capabilities.
Assume the system is going to use the latest versions of MPC-HC x86 (internal LAV filters disabled), ReClock, LAV Filters (external x86 + x64 install package), and madVR. Would an i5 + GTX 960 run the settings listed below without issues?

LAV Video Settings:
- Hardware Decoder - DXVA2 (Copy-Back) - all resolutions (SD, HD, UHD 4K; H.264, HEVC, VC-1, MPEG-2, DVD, VP9), except for MPEG-4
- Output Formats - all enabled, except for AYUV
- RGB Output Levels - Untouched (as input)
- Hardware/GPU Deinterlacing - disabled (no interlacing)
- Software Deinterlacing Algorithm - no software deinterlacing (disabled)

madVR Settings for 1080p:
- Device Properties - 0-255, 10-bit, no 3D, HDR - 120 nit
- Calibration - Rec. 709 3DLUT + Rec. 601 3DLUT
- Display Modes - treat 25p movies as 24p; 1080p23, 1080p24, 1080p59, 1080p60

Processing:
- Deinterlacing - default
- Artifact Removal - enabled; Low (default strength) + Medium (fade in/out strength) for high-quality content, Medium (default strength) + High (fade in/out strength) for LQ content
- Image Enhancements - none enabled
- Zoom Control - none enabled

Scaling Algorithms:
- Chroma Upscaling - NNEDI3 32 neurons + SuperRes w/ strength 3
- Image Downscaling - SSIM 2D w/ strength 100% + AR relaxed + scale in linear light
- Image Doubling - none enabled
- Image Upscaling - Jinc + AR
- Image Refinement - SuperRes w/ strength 3, refine the image after every ~2x upscaling step

Rendering:
- General Settings - exclusive fullscreen, use D3D11 for presentation, present a new frame for every V-sync, CPU queue - 16, GPU queue - 8
- Windowed Mode - present several frames in advance, video frames presented in advance - 3
- Exclusive Mode - present several frames in advance, video frames presented in advance - 3
- Stereo 3D - none enabled
- Smooth Motion - none enabled
- Dithering - Error Diffusion option 2, use colored noise, change dither for every frame
- Trade Quality for Performance - none enabled

Questions:
- I know madVR is not very CPU-dependent, but would an Intel i3 instead of an Intel i5 handle such high-quality settings?
- If the settings above would not run problem-free, then:
A. What minimum specs would you recommend for an HTPC to run the settings listed above without any dropped/delayed frames or presentation glitches?
B. What settings would you change to make an Intel i5 (or i3) + GTX 960 run problem-free while maintaining the highest possible quality for such a rig?

Last edited by XMonarchY; 4th April 2016 at 21:18. |
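Whether a given GPU copes with settings like these ultimately comes down to whether madVR's total rendering time stays under the frame interval of the display mode in use. A trivial Python sketch of those per-frame budgets (illustrative arithmetic only, not tied to any real measurement):

```python
# Per-frame render budget madVR must stay under to avoid dropped frames.
# If rendering + present time exceeds this, the render queues eventually
# drain and you get drops/delays regardless of CPU speed.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (23.976, 24.0, 25.0, 59.94, 60.0):
    print(f"{fps:>7} fps -> {frame_budget_ms(fps):6.2f} ms per frame")
```

At 24p the GPU has roughly 41.7 ms per frame, which is why demanding scalers such as NNEDI3 are usually viable for film content but become much harder to sustain for 60 fps material.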
4th April 2016, 14:24 | #37364 | Link | |
Guest
Posts: n/a
|
Quote:
Last edited by XMonarchY; 4th April 2016 at 14:48. |
4th April 2016, 14:48 | #37365 | Link |
Registered User
Join Date: Nov 2012
Posts: 6
|
I've uninstalled everything here...
Now I'm using MPC-BE with madVR x64 and LAV Filters x64, and after applying the settings posted by XMonarchY it is finally working for me: my 1080p videos play just fine with no stutter. I just have one question about upscaling. When I press "Info" on the TV remote it says "1920x1080 / 24p"; does the TV still do the upscaling for HD and lower content? Sorry if I'm asking in the wrong section, but any help would be appreciated. Also, I'm using DXVA CUDA, which seems to be working; do I have to change to copy-back? Windows 10 x64 Pro here with the latest NVIDIA drivers. |
4th April 2016, 17:39 | #37366 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
DXVA copy-back is better than CUDA now, but you do not have to change it.
It sounds like you have Windows set to output 1080p to your TV. If you want madVR to upscale to 4K (recommended) then you need to set the resolution to 4K in Windows.
__________________
madVR options explained |
4th April 2016, 17:55 | #37368 | Link | |
Registered User
Join Date: Nov 2012
Posts: 6
|
Quote:
My Windows is set to 3840x2160 / 30p (due to my receiver limitation); if I plug my video card directly into the TV, it displays 4K@60Hz. Anyway, I can't upgrade my receiver, so I have to find a way to play my videos, shows, and movies. I think madVR changes the resolution because of the option XMonarchY posted: I've allowed madVR to switch the resolution to 1080p23, 1080p24, 1080p59, and 1080p60, and it changes to 1080p24 almost every time. It is working just fine; I just want to know whether the TV itself is doing any upscaling or whether I have to do it using the madVR options. |
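The 1080p24 switch matters because of display cadence. A small Python sketch (assuming simple frame repetition, which is not what a TV's motion interpolation does) of how evenly source frames map onto display refreshes:

```python
from fractions import Fraction

def repeat_cadence(source_fps: Fraction, refresh_hz: Fraction, frames: int = 8):
    """Return how many refreshes each of the first `frames` source frames
    is displayed for, assuming plain frame repetition (no interpolation)."""
    shown = []
    for i in range(frames):
        # Refresh index at which frame i starts / stops being displayed.
        # Floor division of two Fractions yields an exact integer.
        start = (i * refresh_hz) // source_fps
        end = ((i + 1) * refresh_hz) // source_fps
        shown.append(int(end - start))
    return shown

# 24p film on a 24 Hz mode: perfectly even 1:1 cadence -> smooth
print(repeat_cadence(Fraction(24), Fraction(24)))  # [1, 1, 1, 1, 1, 1, 1, 1]
# 24p film forced into a 30 Hz mode: uneven 1-1-1-2 cadence -> visible judder
print(repeat_cadence(Fraction(24), Fraction(30)))  # [1, 1, 1, 2, 1, 1, 1, 2]
```

This is why letting madVR switch to 1080p24 for film content gives smoother playback than leaving the desktop pinned at 30 Hz.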
4th April 2016, 18:30 | #37371 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,923
|
@XMonarchY
Don't use 10-bit output. The number of 10-bit panels out there is unbelievably low, and Samsung and Sony do not reveal their panels' bit depth, for the simple reason that good processing on an 8-bit panel is better than poor processing on a 10-bit panel. And you are forgetting something very important: resolution. |
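For context on why processing can matter more than panel bit depth: dithering lets an 8-bit signal represent in-between shades on average. A toy Python sketch using plain TPDF random dither (an illustrative stand-in, not madVR's actual ordered/error-diffusion algorithms):

```python
import random

def quantize(x: float, bits: int, dither: bool = False) -> int:
    """Quantize x in [0, 1] to an integer code, optionally adding
    triangular (TPDF) dither noise before rounding."""
    maxcode = (1 << bits) - 1
    v = x * maxcode
    if dither:
        v += random.random() - random.random()  # TPDF noise in (-1, 1)
    return min(max(round(v), 0), maxcode)

# A value that falls between two 8-bit codes:
x = 100.4 / 255
print(quantize(x, 8))  # plain rounding always snaps to 100

# With dithering, the *average* of many quantized samples approaches the
# true value, which is what hides banding on an 8-bit panel:
random.seed(0)
avg = sum(quantize(x, 8, dither=True) for _ in range(20000)) / 20000
print(round(avg, 1))  # ≈ 100.4
```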
4th April 2016, 21:24 | #37372 | Link | |
Guest
Posts: n/a
|
Quote:
Many plasma sets can't do proper 10-bit, or can do it only at 16-235, but I've tested my CCFL LCD (12-bit input in the NVIDIA control panel) with 10-bit test patterns and it passed all of them. There's no gradation/banding, or at least a strong reduction compared to 8-bit. On MY TV, 12-bit input in the NVIDIA CP (10-bit colors + 2 bits from internal processing) and 10-bit in madVR is the best option. If I were advising someone with something like a Panasonic ST/VT/ZT series, then of course I would suggest selecting 8-bit in the NVIDIA CP and 8-bit in madVR. |
5th April 2016, 03:20 | #37373 | Link | |
Registered User
Join Date: Nov 2014
Posts: 81
|
Quote:
(1) can be done via HDMI 2.0 to HDMI 2.0 if your graphics card supports HDMI 2.0, or via a DP-to-HDMI 2.0 adapter. (2) can be done even with a DVI-to-HDMI cable. |
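The bandwidth arithmetic behind needing HDMI 2.0 for 4K@60 can be sketched roughly in Python (the 25% blanking allowance is an assumption; real CTA-861 timings differ slightly):

```python
def data_rate_gbps(width: int, height: int, hz: int,
                   bits_per_channel: int = 8, blanking: float = 1.25) -> float:
    """Rough uncompressed RGB data rate in Gbit/s, with a ~25%
    allowance for blanking intervals (assumed, not exact timing)."""
    return width * height * hz * 3 * bits_per_channel * blanking / 1e9

# 4K@60 8-bit RGB exceeds HDMI 1.4's ~10.2 Gbps TMDS limit -> HDMI 2.0 needed
print(round(data_rate_gbps(3840, 2160, 60), 1))  # 14.9
# 4K@30 fits comfortably, which is why it also works over older links
print(round(data_rate_gbps(3840, 2160, 30), 1))  # 7.5
```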
5th April 2016, 03:24 | #37374 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,923
|
Quote:
Or, if you have an iGPU, try connecting the iGPU to the receiver and connecting the TV directly to the video card. |
5th April 2016, 10:12 | #37375 | Link |
Registered User
Join Date: Apr 2009
Posts: 1,019
|
Sorry that I haven't had a chance to look at any of the new anti-ringing options at all yet.
Thanks for the chroma scaling option to restore the old behavior; chroma quality is noticeably improved now, though it obviously comes at a performance cost.

There seems to be a bug with linear light downscaling and mixed x/y scaling. I'll try to upload a sample this evening which demonstrates the problem. What's happening is that, with luma x downscaling and luma y upscaling, activating linear light downscaling results in a very dark image. If NNEDI3 luma doubling is activated under these conditions (no chroma) the image turns green. This is unaffected by the new chroma quality option.

EDIT: It's the combination of sigmoidal light upscaling and linear light downscaling that seems to cause this. Disabling either one fixes it. Last edited by 6233638; 5th April 2016 at 10:15. |
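For readers wondering why linear light processing changes brightness at all: averaging gamma-encoded values and averaging linear-light values give very different results. A minimal sketch, using a pure 2.2 power law as a stand-in for the real transfer functions madVR works with:

```python
def to_linear(v: float) -> float:
    """Gamma-encoded [0,1] -> linear light (pure 2.2 power, an approximation)."""
    return v ** 2.2

def to_gamma(v: float) -> float:
    """Linear light [0,1] -> gamma-encoded (inverse of the above)."""
    return v ** (1 / 2.2)

black, white = 0.0, 1.0

# Downscaling 2:1 is essentially averaging neighbours. In gamma space:
gamma_avg = (black + white) / 2
# In linear light: decode, average, re-encode:
linear_avg = to_gamma((to_linear(black) + to_linear(white)) / 2)

print(round(gamma_avg, 3))   # 0.5
print(round(linear_avg, 3))  # 0.73 -> noticeably brighter mix
```

Because the two paths disagree this much on a simple average, a pipeline that mixes sigmoidal-light upscaling with linear-light downscaling has to convert carefully in between, which is presumably where a bug like the dark-image one can creep in.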
5th April 2016, 11:03 | #37376 | Link | |
Registered User
Join Date: Oct 2015
Posts: 88
|
5th April 2016, 20:47 | #37378 | Link |
Registered User
Join Date: May 2014
Location: Ukraine
Posts: 25
|
Filter vs MadVR dithering
I use LAV Filters to decode video, and they have their own dithering options. As can be seen, there is no "no dithering" mode. On the other hand, madVR has no "no dithering" mode either. Does that mean dithering is applied twice in this case? Should I use some other video filter with madVR instead of LAV Filters to avoid increased dithering artifacts and wasted resources?
As for madVR's dithering mode "none", its description says that rounding is applied in that case. So it is very likely not truly "nothing", and undesirable for the same reason. Is that right? Last edited by mysterix; 5th April 2016 at 21:00. |
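On the "'none' still rounds" point: rounding is itself a quantization choice, it just concentrates all the error into visible banding instead of spreading it as noise. A 1-D toy sketch (far simpler than madVR's 2-D error diffusion, but it shows the principle):

```python
def quantize_row(row, levels, error_diffusion=True):
    """Quantize a row of values in [0, 1] to `levels` integer codes,
    optionally diffusing each rounding error into the next pixel
    (a 1-D cousin of Floyd-Steinberg error diffusion)."""
    out, err = [], 0.0
    for x in row:
        v = x * (levels - 1) + (err if error_diffusion else 0.0)
        q = min(max(round(v), 0), levels - 1)
        err = v - q  # carry the leftover error forward
        out.append(q)
    return out

row = [0.3] * 8  # a flat area whose true code value (1.2) sits between codes
# Plain rounding (what a "none" dithering mode effectively does)
# collapses the whole area to one code -> banding:
print(quantize_row(row, 5, error_diffusion=False))  # [1, 1, 1, 1, 1, 1, 1, 1]
# Error diffusion alternates codes so the local average stays near 1.2:
print(quantize_row(row, 5))  # [1, 1, 2, 1, 1, 1, 1, 2]
```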
5th April 2016, 20:57 | #37379 | Link | |
Registered User
Join Date: Apr 2006
Posts: 299
|
5th April 2016, 20:58 | #37380 | Link | |
Registered User
Join Date: Sep 2012
Posts: 174
|
Quote:
...unless you do a YUV -> RGB conversion in the decoder. So leave that set to "untouched"; then only madVR is dithering. |
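To make the YUV -> RGB point concrete, here is the standard full-range BT.709 conversion for one pixel (coefficients from the BT.709 matrix; LAV's internal pipeline is of course more involved). The fractional results are exactly why this step needs dithering or rounding before the output fits back into 8-bit integers:

```python
def yuv709_to_rgb(y: float, cb: float, cr: float):
    """Convert one full-range BT.709 YCbCr pixel (0-255 per channel)
    to RGB floats using the standard matrix coefficients."""
    d, e = cb - 128.0, cr - 128.0
    r = y + 1.5748 * e
    g = y - 0.1873 * d - 0.4681 * e
    b = y + 1.8556 * d
    return r, g, b

r, g, b = yuv709_to_rgb(120, 100, 140)
print(r, g, b)  # e.g. r ≈ 138.9 -> fractional, must be rounded or dithered
```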