Old 4th April 2016, 10:47   #37361  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,650
Quote:
Originally Posted by Betroz View Post
When I use super-xbr for image doubling/quadrupling and there is a scene where people's faces are up close, their faces are nice and sharp, but faces in the background get much grainier. With NNEDI3 64 and up, faces in the background are sharper than with super-xbr. Is this normal? It kind of bugs me, because I want things to be perfect
Sounds normal to me. You can supplement super-xbr with some additional sharpening, though, and find the right mix of options to get the effect you're after; a rough sketch of what that kind of sharpening does is below.
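To make "additional sharpening" concrete, here is a minimal unsharp-mask sketch in Python. It is a generic textbook filter, not madVR's actual sharpening shader, and the radius/strength values are illustrative assumptions only.

```python
# Generic unsharp mask: subtract a blurred copy to isolate detail,
# then add a fraction of that detail back. Not madVR's implementation.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(luma: np.ndarray, radius: float = 1.0, strength: float = 0.5) -> np.ndarray:
    """Sharpen a 2D luma plane with float values in 0..1."""
    blurred = gaussian_filter(luma, sigma=radius)  # low-pass copy
    detail = luma - blurred                        # high-frequency residue
    return np.clip(luma + strength * detail, 0.0, 1.0)
```

Tuning radius against strength is essentially the "right mix" being described: a small radius emphasizes fine grain, a larger one emphasizes edges.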
Old 4th April 2016, 11:17   #37362  |  Link
Queiroz
Registered User
 
 
Join Date: Nov 2012
Posts: 6
Hello everyone.

My 60" 1080p plasma died on me a few weeks ago, so a few days ago I bought a new 60" UHD 4K screen, and now I have trouble watching almost everything: frame drops, stutter. Let me explain the situation: my receiver (an Onkyo TX-NR525) can only do 4K@30Hz, and my HTPC's video card is a GTX 670. I just want to play my videos smoothly. Is that still possible?

I'm using Shark007 codecs + MPC-BE with madVR (x86) + ReClock.

I'll uninstall everything and try the 64-bit MPC-BE with madVR now, without ReClock.

I hope it works.

Any advice would be really appreciated.

Thanks in advance.

Kind Regards,
Queiroz
Old 4th April 2016, 12:23   #37363  |  Link
XMonarchY
Guest
 
Posts: n/a
I enlighten a lot of people about madVR, and many of them ask what HTPC they should build JUST for film playback. I usually advise an Intel i5 + GeForce GTX 960, due to its overall good performance and H.265 hardware decoding capabilities.

Assume the system is going to use the latest versions of MPC-HC x86 (internal LAV Filters disabled), ReClock, LAV Filters (external x86 + x64 install package), and madVR. Would an i5 + GTX 960 run the settings listed below without issues?

LAV Video Settings:

Hardware Decoder - DXVA2 (Copy-Back) - All resolutions (SD, HD, UHD 4K) and all codecs (H.264, HEVC, VC-1, MPEG-2, DVD, VP9), except MPEG-4
Output Formats - All enabled, except for AYUV.
RGB Output Levels - Untouched (as input).
Hardware/GPU Deinterlacing - Disabled (No interlacing).
Software Deinterlacing Algorithm - No software deinterlacing (disabled).

madVR Settings for 1080p:

Devices
Properties - 0-255, 10bit, no 3D, HDR - 120nit
Calibration - Rec. 709 3DLUT + Rec. 601 3DLUT
Display Modes - Treat 25p movies as 24p, 1080p23, 1080p24, 1080p59, 1080p60

Processing
De-Interlacing - Default
Artifact Removal - Enabled, Low (Default strength) + Medium (Fade in/out strength) for High Quality content, Medium (Default strength) & High (Fade in/out strength) for LQ content
Image Enhancements - None enabled
Zoom Control - None enabled

Scaling Algorithms

Chroma Upscaling - NNEDI3 32n + SuperRes w/ Strength 3
Image Downscaling - SSIM 2D w/ Strength 100% + AR Relaxed + Scale in LL
Image Doubling - None enabled
Image Upscaling - Jinc + AR
Image Refinement - SuperRes w/ Strength 3, Refine the image every ~2x upscaling step

Rendering
General Settings - Exclusive FullScreen, Use D3D11 for presentation, Present a new frame for every V-sync, CPU Queue - 16, GPU Queue - 8
Windows Mode - Present several frames in advance, video frames presented in advance - 3
Exclusive Mode - Present several frames in advance, video frames presented in advance - 3
Stereo 3D - None enabled
Smooth Motion - None enabled
Dithering - ED Option 2, Use Colored Noise, Change dither for every frame (see the error-diffusion sketch after this list)
Trade Quality for Performance - None enabled
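For context on the dithering entry above, the sketch below shows classic Floyd-Steinberg error diffusion: quantize each pixel, then push the rounding error onto neighbours not yet visited. This is the textbook algorithm, not a reimplementation of madVR's actual "ED Option 2".

```python
# Textbook Floyd-Steinberg error diffusion: quantize each pixel and
# distribute the rounding error to pixels that have not been visited yet.
import numpy as np

def floyd_steinberg(img: np.ndarray, bits: int = 8) -> np.ndarray:
    """Quantize a 2D float image (0..1) to 2**bits levels with error diffusion."""
    levels = (1 << bits) - 1
    out = img.astype(np.float64).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = np.round(old * levels) / levels   # nearest representable level
            out[y, x] = new
            err = old - new                         # error to diffuse forward
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h and x > 0:
                out[y + 1, x - 1] += err * 3 / 16
            if y + 1 < h:
                out[y + 1, x] += err * 5 / 16
            if x + 1 < w and y + 1 < h:
                out[y + 1, x + 1] += err * 1 / 16
    return out
```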


Questions:
- I know madVR is not very CPU-dependent, but would an Intel i3 CPU instead of an Intel i5 handle such high quality settings?
- If the settings I provided would not give problem-free performance, then:
A. What minimum specs would you recommend for an HTPC that would run the settings listed above without any dropped/delayed frames or presentation issues?
B. What settings would you advise changing to make an Intel i5 (or i3) + GTX 960 run problem-free while maintaining the highest quality possible for such a rig?

Last edited by XMonarchY; 4th April 2016 at 21:18.
Old 4th April 2016, 14:24   #37364  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by Betroz View Post
You guys were right. A 980 Ti for madVR in an HTPC case is too much, with temps going up to 83C... so I swapped it out for my 970 card (and that gets hot enough).

When I use super-xbr for image doubling/quadrupling and there is a scene where people's faces are up close, their faces are nice and sharp, but faces in the background get much grainier. With NNEDI3 64 and up, faces in the background are sharper than with super-xbr. Is this normal? It kind of bugs me, because I want things to be perfect
Yeah, for me even close-up faces get too grainy and just look low-quality when I use super-xbr for doubling, even though I do not use any sharpening features. NNEDI3 luma doubling is absolutely stunning and IMHO the strongest, most image-improving aspect of madVR, aside from maybe ED dithering. Combined with all the other very high quality settings, SuperRes makes it a hell of a lot better.

Last edited by XMonarchY; 4th April 2016 at 14:48.
Old 4th April 2016, 14:48   #37365  |  Link
Queiroz
Registered User
 
 
Join Date: Nov 2012
Posts: 6
I've uninstalled everything here...

Now I'm using MPC-BE with madVR x64 and LAV Filters x64, and after applying the settings posted by XMonarchY it is finally working for me; my 1080p videos play just fine with no stutter. I just have one question about upscaling: when I press "info" on the TV remote, it says "1920x1080 / 24p". Does the TV still do the upscaling for HD and lower content? Sorry if I'm asking in the wrong section, but any help would be appreciated. Also, I'm using the CUDA hardware decoder, which seems to be working; do I have to change to DXVA copy-back? Windows 10 x64 Pro here with the latest NVIDIA drivers.
Old 4th April 2016, 17:39   #37366  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
DXVA copy-back is better than CUDA now, but you do not have to change it.

It sounds like you have Windows set to output 1080p to your TV. If you want madVR to upscale to 4K (recommended) then you need to set the resolution to 4K in Windows.
__________________
madVR options explained
Old 4th April 2016, 17:51   #37367  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by XMonarchY View Post
Questions:
- I know madVR is not very CPU-dependent, but would an Intel i3 CPU instead of an Intel i5 handle such high quality settings?
I have a dual-core Ivy Bridge Pentium from 2013 coupled with a GTX 960. In my use at least (1080p, 2D), it's the GPU that maxes out, not the CPU.

Last edited by Uoppi; 4th April 2016 at 17:55.
Old 4th April 2016, 17:55   #37368  |  Link
Queiroz
Registered User
 
 
Join Date: Nov 2012
Posts: 6
Quote:
Originally Posted by Asmodian View Post
DXVA copy-back is better than CUDA now, but you do not have to change it.

It sounds like you have Windows set to output 1080p to your TV. If you want madVR to upscale to 4K (recommended) then you need to set the resolution to 4K in Windows.
I see.

My Windows is set to 3840x2160 / 30p (due to my receiver's limitation), because if I plug the video card directly into the TV it displays 4K@60Hz. Anyway, I can't upgrade my receiver, so I have to find a way to play my videos, shows and movies.

But I think madVR changes the resolution because of the options XMonarchY posted; I've allowed madVR to auto-change the resolution to 1080p23, 1080p24, 1080p59 and 1080p60, and it changes to 1080p24 almost every time.

It is working just fine; I just want to know whether the TV itself is doing any upscaling, or whether I have to do it using the madVR options.

Old 4th April 2016, 18:13   #37369  |  Link
BluesFanUK
Registered User
 
Join Date: Sep 2015
Posts: 60
Quote:
Originally Posted by Asmodian View Post
DXVA copy-back is better than CUDA now, but you do not have to change it.

It sounds like you have Windows set to output 1080p to your TV. If you want madVR to upscale to 4K (recommended) then you need to set the resolution to 4K in Windows.
I thought using the LAV hardware decoders like CUDA and DXVA wasn't worth the hassle?
Old 4th April 2016, 18:25   #37370  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
If your CPU is fast enough to play your source material, you don't need to bother with hardware decoders. That hasn't changed.
Old 4th April 2016, 18:30   #37371  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
@XMonarchY

Don't use 10-bit output.

The number of 10-bit panels out there is unbelievably low.

And Samsung and Sony are not revealing their panels' bit depth, for the simple reason that good processing with an 8-bit panel is better than poor processing with a 10-bit panel.

And you forgot something very, very important: resolution.
Old 4th April 2016, 21:24   #37372  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by huhn View Post
@XMonarchY

Don't use 10-bit output.

The number of 10-bit panels out there is unbelievably low.

And Samsung and Sony are not revealing their panels' bit depth, for the simple reason that good processing with an 8-bit panel is better than poor processing with a 10-bit panel.

And you forgot something very, very important: resolution.
I didn't forget resolution. I mentioned it four times (1080p23, 1080p24, 1080p59, 1080p60), and I have now added another 1080p mention to the madVR settings sub-heading.


Many plasma sets can't do proper 10-bit, or can do it only at 16-235, but I've tested my CCFL LCD (12-bit input in the NVIDIA control panel) with 10-bit test patterns and it passed all of them. There's no gradation/banding, or at least a strong reduction compared to 8-bit. On MY TV, 12-bit input in the NVIDIA control panel (10-bit color + 2 bits from internal processing) and 10-bit in madVR is the best option.

If I were advising someone with something like a Panasonic ST/VT/ZT series set, then of course I would suggest selecting 8-bit in the NVIDIA control panel and 8-bit in madVR.
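For anyone wanting to run a similar banding check, here is a minimal sketch: render a shallow grayscale ramp at 8-bit and 10-bit depth and compare them on the display under test. The dimensions and range are arbitrary assumptions; dedicated calibration test patterns remain the better tool.

```python
# Generate shallow grayscale ramps at two bit depths to eyeball banding.
# A ramp covering only ~1/8 of the code range makes the steps obvious.
import numpy as np

def ramp(width: int = 3840, height: int = 200, bits: int = 10) -> np.ndarray:
    levels = 1 << bits
    row = np.linspace(levels // 2, levels // 2 + levels // 8, width)
    return np.tile(np.round(row), (height, 1)).astype(np.uint16)

ramp8 = ramp(bits=8)    # ~33 distinct steps across the ramp: visible bands
ramp10 = ramp(bits=10)  # ~129 distinct steps: should look smooth
```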
Old 5th April 2016, 03:20   #37373  |  Link
bcec
Registered User
 
Join Date: Nov 2014
Posts: 81
Quote:
Originally Posted by Queiroz View Post
I see.

My Windows is set to 3840x2160 / 30p (due to my receiver's limitation), because if I plug the video card directly into the TV it displays 4K@60Hz. Anyway, I can't upgrade my receiver, so I have to find a way to play my videos, shows and movies.
I recommend you run two cables: one from your graphics card to your TV (1), the other from your graphics card to your receiver (2). From the perspective of your HTPC, it will think it has two monitors connected: your TV and your receiver. You then use (2) for audio. This way you can enjoy sending 4K@60Hz to your TV while still using your existing receiver for audio. This is my current setup.

(1) can be done HDMI 2.0 to HDMI 2.0 if your graphics card supports HDMI 2.0, or via a DP-to-HDMI 2.0 adapter.
(2) can be done even with a DVI-to-HDMI cable.
Old 5th April 2016, 03:24   #37374  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,923
Quote:
Originally Posted by Queiroz View Post
I see.

My Windows is set to 3840x2160 / 30p (due to my receiver's limitation), because if I plug the video card directly into the TV it displays 4K@60Hz. Anyway, I can't upgrade my receiver, so I have to find a way to play my videos, shows and movies.

But I think madVR changes the resolution because of the options XMonarchY posted; I've allowed madVR to auto-change the resolution to 1080p23, 1080p24, 1080p59 and 1080p60, and it changes to 1080p24 almost every time.

It is working just fine; I just want to know whether the TV itself is doing any upscaling, or whether I have to do it using the madVR options.

Try 2160p23, 2160p24, 2160p25, 2160p29, 2160p30, 1080p50, 1080p59, 1080p60.

Or, if you have an iGPU, try connecting the iGPU to the receiver and connecting the TV directly.
Old 5th April 2016, 10:12   #37375  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Sorry that I haven't had a chance to look at any of the new anti-ringing options at all yet.
Thanks for the chroma scaling option to restore the old behavior; chroma quality is noticeably improved now, though it obviously comes at a performance cost.

There seems to be a bug with linear light downscaling and mixed x/y scaling.
I'll try to upload a sample this evening which demonstrates the problem.
What's happening is that, with luma x downscaling and luma y upscaling, activating linear light downscaling results in a very dark image.
If NNEDI3 luma doubling is activated under these conditions (no chroma) the image turns green.

This is unaffected by the new chroma quality option.

EDIT: It's the combination of sigmoidal light upscaling and linear light downscaling that seems to cause this.
Disabling either one fixes it.
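For readers unfamiliar with the terminology in this report, the sketch below shows what downscaling "in linear light" means in miniature: undo the gamma transfer, resample, re-encode. It assumes a pure power gamma of 2.2 (madVR's real pipeline is more elaborate), but it illustrates why a mismatched forward/inverse curve, e.g. a sigmoid applied on only one side, would shift the image dark or tint it.

```python
# Downscaling "in linear light": decode gamma, average, re-encode.
# Pure power-law 2.2 for simplicity; assumes an even-sized single plane.
import numpy as np

def to_linear(v: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    return np.power(v, gamma)

def to_gamma(v: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    return np.power(v, 1.0 / gamma)

def downscale_2x_linear(luma: np.ndarray) -> np.ndarray:
    lin = to_linear(luma)                                # gamma -> linear
    lin = (lin[0::2, 0::2] + lin[1::2, 0::2] +
           lin[0::2, 1::2] + lin[1::2, 1::2]) / 4.0      # naive 2x box filter
    return to_gamma(lin)                                 # linear -> gamma

# If the encode step used a different curve than the decode step,
# every averaged pixel would come back too dark: the symptom above.
```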

Last edited by 6233638; 5th April 2016 at 10:15.
Old 5th April 2016, 11:03   #37376  |  Link
XTrojan
Registered User
 
Join Date: Oct 2015
Posts: 88
Quote:
Originally Posted by bcec View Post
I recommend you run two cables: one from your graphics card to your TV (1), the other from your graphics card to your receiver (2). From the perspective of your HTPC, it will think it has two monitors connected: your TV and your receiver. You then use (2) for audio. This way you can enjoy sending 4K@60Hz to your TV while still using your existing receiver for audio. This is my current setup.

(1) can be done HDMI 2.0 to HDMI 2.0 if your graphics card supports HDMI 2.0, or via a DP-to-HDMI 2.0 adapter.
(2) can be done even with a DVI-to-HDMI cable.
The only issue with this is that two monitors are registered. Won't the unused monitor consume resources and cause compatibility issues?
Old 5th April 2016, 17:05   #37377  |  Link
bcec
Registered User
 
Join Date: Nov 2014
Posts: 81
Quote:
Originally Posted by XTrojan View Post
The only issue with this is that two monitors are registered. Won't the unused monitor consume resources and cause compatibility issues?
Nope. At least I have not observed any drawbacks at all.
Old 5th April 2016, 20:47   #37378  |  Link
mysterix
Registered User
 
Join Date: May 2014
Location: Ukraine
Posts: 25
LAV Filters vs. madVR dithering

I use LAV Filters to decode video, and it has its own dithering options (see the attached screenshot). As you can see, there is no "no dithering" mode. On the other hand, madVR has no "no dithering" mode either. Does that mean dithering is applied twice in this case? Should I use some other video filter with madVR instead of LAV Filters, to avoid increased dithering artifacts and wasted resources?
As for madVR's "none" dithering mode, its description says that rounding is applied instead. So it is very likely not truly "nothing", and is undesirable for the same reason. Is that right?

[Attached image: screenshot of LAV Filters' dithering options]
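To illustrate what is at stake between plain rounding ("none") and real dithering, here is a toy sketch, not LAV's or madVR's actual algorithms: rounding turns a smooth gradient into hard steps (banding), while adding sub-LSB noise before rounding makes individual pixels noisier but keeps the local average faithful to the source.

```python
# Rounding vs. RPDF dithering when truncating a smooth gradient to 8 bits.
import numpy as np

rng = np.random.default_rng(0)
gradient = np.linspace(0.0, 1.0, 4096)             # high-precision source

rounded = np.round(gradient * 255) / 255           # "none": hard quantization
noise = rng.uniform(-0.5, 0.5, gradient.size)      # < 1 LSB of uniform noise
dithered = np.round(gradient * 255 + noise) / 255  # dither: add noise, round

def block_mean(a: np.ndarray, n: int = 16) -> np.ndarray:
    return a.reshape(-1, n).mean(axis=1)

src = block_mean(gradient)
print(np.abs(block_mean(rounded) - src).mean())    # staircase bias remains
print(np.abs(block_mean(dithered) - src).mean())   # smaller: noise averages out
```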

Last edited by mysterix; 5th April 2016 at 21:00.
Old 5th April 2016, 20:57   #37379  |  Link
markanini
Registered User
 
Join Date: Apr 2006
Posts: 299
Quote:
Originally Posted by 6233638 View Post
Sorry that I haven't had a chance to look at any of the new anti-ringing options at all yet.
Thanks for the chroma scaling option to restore the old behavior; chroma quality is noticeably improved now, though it obviously comes at a performance cost.

There seems to be a bug with linear light downscaling and mixed x/y scaling.
I'll try to upload a sample this evening which demonstrates the problem.
What's happening is that, with luma x downscaling and luma y upscaling, activating linear light downscaling results in a very dark image.
If NNEDI3 luma doubling is activated under these conditions (no chroma) the image turns green.

This is unaffected by the new chroma quality option.

EDIT: It's the combination of sigmoidal light upscaling and linear light downscaling that seems to cause this.
Disabling either one fixes it.
Well done on finding the cause of the dark image bug. It was driving me crazy!
Old 5th April 2016, 20:58   #37380  |  Link
Q-the-STORM
Registered User
 
Join Date: Sep 2012
Posts: 174
Quote:
Originally Posted by mysterix View Post
I use LAV Filters to decode video, and it has its own dithering options (see the attached screenshot). As you can see, there is no "no dithering" mode. On the other hand, madVR has no "no dithering" mode either. Does that mean dithering is applied twice in this case? Should I use some other video filter with madVR instead of LAV Filters, to avoid increased dithering artifacts and wasted resources?
LAV doesn't dither if you use madVR...
unless you do a YUV -> RGB conversion in LAV. So leave that setting on "untouched"; then only madVR is dithering.
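To see why a YUV -> RGB conversion inside the decoder forces its own rounding or dithering pass: the matrix output lands between 8-bit codes, so precision must be discarded if the decoder emits 8-bit RGB. A minimal sketch with BT.709 limited-range constants follows (LAV's exact coefficients and internals are not reproduced here):

```python
# 8-bit limited-range BT.709 YCbCr -> RGB produces non-integer results,
# which is why a conversion step needs its own rounding or dithering.
import numpy as np

def yuv_to_rgb_bt709(y: int, cb: int, cr: int) -> np.ndarray:
    """Convert one 8-bit limited-range YCbCr sample to RGB on a 0..255 scale."""
    y_ = (y - 16) / 219.0
    cb_ = (cb - 128) / 224.0
    cr_ = (cr - 128) / 224.0
    r = y_ + 1.5748 * cr_
    g = y_ - 0.1873 * cb_ - 0.4681 * cr_
    b = y_ + 1.8556 * cb_
    return np.array([r, g, b]) * 255.0

print(yuv_to_rgb_bt709(126, 100, 150))  # fractional RGB -> must round or dither
```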