Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 27th July 2016, 05:32   #38921  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,292
Quote:
Originally Posted by 70MM View Post
Ok, nice to know. I will just leave it at 128; I really can't see an improvement at 256... Many thanks for the correction on NNEDI3.
Chroma isn't that big of a deal; personally I can't imagine ever using anything above 32 or 64 neurons if I choose to use NNEDI3. I feel super-XBR is a far better option. Why not use a few videos for comparison to evaluate the difference?

I did a test on my GTX 960 with a full HD video and it took my rendering time up from 11 ms to 28 ms. If you're going to use NNEDI3 for chroma then consider 64 neurons; that was around 16 ms here. I doubt you'd notice the difference between 64 and 128 neurons. If you're not using this performance for anything else... sure, but it's excessive and you wouldn't find me using it, even if I had a Titan X!

As always, I'd recommend users evaluate the difference in the picture before changing options, especially the high-cost ones. NNEDI3's sweet spot is 64 neurons as far as cost versus upscaling improvement is concerned; I feel anything higher for chroma is just wasting power. If you can't visually see a difference then there's no point in changing a setting that costs more performance, more heat, more noise and more on your bill. It doesn't make much sense.
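As a rough sketch of why those render times matter (the exact figures are the ones quoted above; whether a given card hits them is hardware-dependent): a frame has to render inside one refresh interval or madVR drops it.

```python
# Sanity-check of the NNEDI3 chroma numbers quoted above
# (11 ms baseline, ~16 ms at 64 neurons, 28 ms at 128 neurons on a GTX 960).

def frame_budget_ms(fps: float) -> float:
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / fps

def fits(render_ms: float, fps: float) -> bool:
    """True if the measured render time leaves headroom at this frame rate."""
    return render_ms < frame_budget_ms(fps)

budget_24p = frame_budget_ms(24000 / 1001)   # ~41.7 ms for 23.976 fps content
print(round(budget_24p, 1))                  # 41.7
print(fits(28.0, 24000 / 1001))              # True  - 128 neurons still fits at 24p
print(fits(28.0, 60.0))                      # False - not at 60 fps (16.7 ms budget)
```

So 128 neurons "fits" for movies but leaves almost no headroom for anything else the renderer has to do, which is the point being made.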
Old 27th July 2016, 06:05   #38922  |  Link
70MM
X Cinema Projectionist NZ
 
Join Date: Feb 2006
Location: Auckland NZ
Posts: 281
Quote:
Originally Posted by ryrynz View Post
Chroma isn't that big of a deal; personally I can't imagine ever using anything above 32 or 64 neurons if I choose to use NNEDI3. I feel super-XBR is a far better option. Why not use a few videos for comparison to evaluate the difference?

I did a test on my GTX 960 with a full HD video and it took my rendering time up from 11 ms to 28 ms. If you're going to use NNEDI3 for chroma then consider 64 neurons; that was around 16 ms here. I doubt you'd notice the difference between 64 and 128 neurons. If you're not using this performance for anything else... sure, but it's excessive and you wouldn't find me using it, even if I had a Titan X!

As always, I'd recommend users evaluate the difference in the picture before changing options, especially the high-cost ones. NNEDI3's sweet spot is 64 neurons as far as cost versus upscaling improvement is concerned; I feel anything higher for chroma is just wasting power. If you can't visually see a difference then there's no point in changing a setting that costs more performance, more heat, more noise and more on your bill. It doesn't make much sense.
Oh Gee!
I thought using the higher 128 neurons with NNEDI3 was going to be the icing on the cake, but it looks like it's not really worth it. I knew that luma is much more important than chroma, and I have noticed that the rendering time does go up, yet I thought since I had the 1080 card I should be using the highest settings... But you are right, my eyes don't see anything better with the higher chroma settings.

I wish there were some luma settings I could use to bump that side up, since they say luma is the more important of the two.

I will drop the chroma upscaling back to 64 neurons, and that will really lower the render times too...

My screen is pretty large: 120" diagonal at 16:9, and 146" wide for scope. I have to take care with some of the settings, as if they are bumped up too high they really show up artefacts!

I'm learning stuff from this thread, and many of you experts have been very helpful.

BTW, my watching is mainly BD rips; I don't game or show many animated progs. No TV or anything else like that...

However, I have stopped using image doubling after many here said it's not really worth it on 1080p > 1080p BD rips...
Old 27th July 2016, 11:32   #38923  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 1,595
Quote:
Originally Posted by ionutm80 View Post
The 25 Hz refresh rate is not supported by the TV, so these movies are displayed at 50 Hz. The madVR stats report 1 frame drop every 8-9 hrs or even 1.2 days (I have very light upscaling options), but show a huge number of repeated frames, and the counter does not stop.
That's strange, it shouldn't work like this.

Quote:
Originally Posted by ionutm80 View Post
Is it because the 25 fps movie in order to be displayed on 50 Hz screen the number of frames need to be doubled?
No, there's no doubling used unless Smooth Motion is enabled. Do you have it enabled? If so, disable it; you don't need it at all if the display refresh rate is really close to the original fps.

Quote:
Originally Posted by ionutm80 View Post
Is this something normal or am I doing something wrong here?
No, it's not normal, and you're doing everything right.
This is how I watch 29.97/30 fps content at 60 Hz (in PC mode; it treats chroma differently).
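For context on the climbing repeated-frames counter: when 25 fps content plays at 50 Hz, every source frame is presented on two consecutive refresh cycles. A minimal sketch (how exactly madVR's counter tallies this is an assumption; the arithmetic is not):

```python
# Why a 25 fps movie on a 50 Hz display shows a steadily climbing
# "repeated frames" figure: each frame is shown on two refresh cycles.

def presentations_per_frame(refresh_hz: float, fps: float) -> float:
    return refresh_hz / fps

def repeats_after(seconds: float, refresh_hz: float, fps: float) -> int:
    """Frames shown more than once after `seconds` of playback."""
    per_frame = presentations_per_frame(refresh_hz, fps)
    if per_frame <= 1:
        return 0
    return int(seconds * fps)  # every frame gets at least one extra presentation

print(presentations_per_frame(50, 25))   # 2.0 - each frame shown twice
print(repeats_after(60, 50, 25))         # 1500 repeated frames after one minute
```

That doubling is intentional and harmless; it is distinct from the drift-induced drops the stats also report.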
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 27th July 2016, 12:45   #38924  |  Link
Sunset1982
Registered User
 
Join Date: Sep 2014
Posts: 278
Quote:
Originally Posted by ShiftyFella View Post
LumaSharpen is a good sharpener and I prefer it over AdaptiveSharpen, but that said, I'm not a big fan of LumaSharpen: low values do nothing for me, and with bigger values I noticed it sharpens noise as well. So I prefer to use crispen edges with enhance detail instead, as it achieves a better result for my taste without really amplifying noise/film grain.
Ok, thank you for your explanation. What values are you using with Crispen Edges and Enhance Detail?
__________________
Intel i5 6600, 16 GB DDR4, AMD Vega RX56 8 GB, Windows 10 x64, Kodi DS Player 17.6, MadVR (x64), LAV Filters (x64), XySubfilter .746 (x64)
LG 4K OLED (65C8D), Denon X-4200 AVR, Dali Zensor 5.1 Set
Old 27th July 2016, 12:54   #38925  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Hello,

Is there any guide to what the Intel HD 530 internal graphics can handle? I'd like to see if I can get a better picture out of madVR; I have never really played with the settings.
Old 27th July 2016, 13:00   #38926  |  Link
AntonP
Registered User
 
Join Date: Jul 2016
Posts: 13
Quote:
Originally Posted by robl45 View Post
You can't; you get the 3D refresh rate that you get, and for Nvidia that is 23.971 or so. ReClock certainly works fine with this, I have used it for years, but you aren't going to get bitstreaming like that, so I have switched to using the internal Intel graphics. Heck of a waste of a GTX 950, but oh well.

At any rate, ReClock will speed the video up to 24 fps and match the audio. You should set your refresh rate to the Nvidia 23 Hz setting, which is 23.971, and let ReClock do its thing.

I can't agree with you. For perfect playback with no dropped or repeated frames, the monitor's refresh rate (from Nvidia) and the movie's fps should be exactly the same.
ReClock makes a 24 fps movie from 23.976, and with a 24 Hz display refresh rate (madVR's 1080p24 mode) the picture is perfect! With the 1080p23 mode (23.971 from Nvidia) there are dropped frames. That's all for 2D. In 3D, the Nvidia panel and Windows display properties have both 23p and 24p modes, but madVR doesn't let me choose! There is no such option in the display modes, something like "1080p3D24p" ((




PS: ReClock works with bitstream audio with no problems!
Old 27th July 2016, 13:03   #38927  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
You don't understand how ReClock works. Your refresh rate has to be the same as the movie's for perfect playback. AFAIK, only Intel can do a perfect 23.976. With Nvidia, you get 24 Hz or 23.971. ReClock will speed the movie up to 24 and match the audio. 23.971 or 24 Hz makes no difference; you still aren't matching the video, and ReClock will do the work. In the madVR display you will see one frame drop or repeat like once every 8 hours or more if ReClock is doing its job.
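The "one drop every X hours" figure follows from simple arithmetic: one frame slips roughly every 1 / |fps − Hz| seconds. A sketch (the 0.00003 Hz residual error below is purely illustrative of a near-perfect match):

```python
# Estimate how often a frame drop/repeat occurs from the mismatch
# between content frame rate and display refresh rate.

def slip_interval_s(fps: float, refresh_hz: float) -> float:
    diff = abs(fps - refresh_hz)
    if diff == 0:
        return float("inf")  # perfect match: no drops or repeats ever
    return 1.0 / diff

# Unmatched 23.976 fps content on a 23.971 Hz display: a slip every ~3 minutes.
print(round(slip_interval_s(24000 / 1001, 23.971)))        # ~199 s
# After ReClock resamples the video to the display clock, only a tiny
# residual error remains; 0.00003 Hz gives one slip every ~9 hours.
print(round(slip_interval_s(24.0, 24.00003) / 3600, 1))    # ~9.3 h
```

This is why the raw 23.971-vs-23.976 mismatch doesn't matter once ReClock is resampling: the effective mismatch shrinks to clock jitter.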


Quote:
Originally Posted by AntonP View Post
I can't agree with you. For perfect playback with no dropped or repeated frames, the monitor's refresh rate (from Nvidia) and the movie's fps should be exactly the same.
ReClock makes a 24 fps movie from 23.976, and with a 24 Hz display refresh rate (madVR's 1080p24 mode) the picture is perfect! With the 1080p23 mode (23.971 from Nvidia) there are dropped frames. That's all for 2D. In 3D, the Nvidia panel and Windows display properties have both 23p and 24p modes, but madVR doesn't let me choose! There is no such option in the display modes, something like "1080p3D24p" ((

PS: ReClock works with bitstream audio with no problems!
Old 27th July 2016, 13:15   #38928  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,292
Quote:
Originally Posted by robl45 View Post
Hello,

Is there any guide to what the Intel HD 530 internal graphics can handle? I'd like to see if I can get a better picture out of madVR; I have never really played with the settings.
No, you'll need to test that for yourself; "playing with the settings" is kind of a requirement to get the best (in your opinion) out of it.
Old 27th July 2016, 13:18   #38929  |  Link
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,475
Quote:
Originally Posted by 70MM View Post
I'm currently using Error Diffusion Option 2 with the two above ticked...

Is it cleaner without using them?

I only ticked them as a tutorial on the web said to tick both
Did you miss the parts that say "subjectively increases noise" and "increases chroma noise"?

Tutorials are nice and all, but we all run different rigs, have different vision (8% of men are colorblind to varying degrees) and different personal tastes.
Old 27th July 2016, 14:06   #38930  |  Link
AntonP
Registered User
 
Join Date: Jul 2016
Posts: 13
Quote:
Originally Posted by robl45 View Post
You don't understand how ReClock works. Your refresh rate has to be the same as the movie's for perfect playback. AFAIK, only Intel can do a perfect 23.976. With Nvidia, you get 24 Hz or 23.971. ReClock will speed the movie up to 24 and match the audio. 23.971 or 24 Hz makes no difference; you still aren't matching the video, and ReClock will do the work. In the madVR display you will see one frame drop or repeat like once every 8 hours or more if ReClock is doing its job.
I understand it. )) AFAIK both ATI and Intel (newer models) do a correct 23.976.

The only difference between the 23.971 and 24 Hz display refresh rates is that with 24 Hz + reclocked 23.976 video I have no visible drops (one in 20-40 hours is not a problem for me).
I want to use the same trick in 3D, but I can't because of the absence of a 24 (48) Hz option.
Old 27th July 2016, 14:15   #38931  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Quote:
Originally Posted by ryrynz View Post
No, you'll need to test that for yourself "playing with the settings" is kind of a requirement to get the best (in your opinion) from it.
Yes, but how do I know what the HD 530 graphics can handle? Is it basically that the CPU gets maxed out?
Old 27th July 2016, 14:20   #38932  |  Link
strumf666
Registered User
 
Join Date: Jan 2012
Posts: 94
You mean the GPU?
Usually dropped frames start occurring at GPU loads of around 70% and above. It's quite visible to the eye, but you can also check the OSD (Ctrl+J) for the dropped frames counter.

Last edited by strumf666; 27th July 2016 at 14:24.
Old 27th July 2016, 14:21   #38933  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Quote:
Originally Posted by AntonP View Post
I understand it. )) AFAIK both ATI and Intel (newer models) do a correct 23.976.

The only difference between the 23.971 and 24 Hz display refresh rates is that with 24 Hz + reclocked 23.976 video I have no visible drops (one in 20-40 hours is not a problem for me).
I want to use the same trick in 3D, but I can't because of the absence of a 24 (48) Hz option.
You would have no visible drops with 23.971 with ReClock either; I've been using it this way for years. You may have to clean out the timings in the ReClock config so it can learn the 23.971 refresh rate. As far as I know, for 3D you get 23.971 with Nvidia. I tried making a custom refresh rate with Nvidia and the 3D got amazingly messed up. I gave up on the Nvidia at that point, as I'd like to be able to try DTS:X and Atmos, and they need to bitstream at 23.976.
Old 27th July 2016, 14:22   #38934  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Quote:
Originally Posted by strumf666 View Post
You mean the GPU?
How would I know if the GPU gets maxed out? I assume the CPU will start maxing out if the GPU can't handle it? Sorry, I've never really played with this much even though I've been using madVR for years; I set it to some low settings as my old GT 430 wasn't very powerful.
Old 27th July 2016, 14:28   #38935  |  Link
strumf666
Registered User
 
Join Date: Jan 2012
Posts: 94
I edited my first reply for more info and clarity. You can use a program like GPU-Z for GPU load, or any other (there are many), but you don't actually need that: massive frame drops are easily visible, and for minor ones you can use the OSD (Ctrl+J), which has a dropped frames counter.
Your assumption isn't entirely correct, because the CPU and GPU aren't fully interchangeable in regard to the work madVR requires, and even then the GPU is usually much more powerful.

Last edited by strumf666; 27th July 2016 at 14:32.
Old 27th July 2016, 18:22   #38936  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by AntonP View Post
I can't agree with you. For perfect playback with no dropped or repeated frames, the monitor's refresh rate (from Nvidia) and the movie's fps should be exactly the same.
ReClock makes a 24 fps movie from 23.976, and with a 24 Hz display refresh rate (madVR's 1080p24 mode) the picture is perfect! With the 1080p23 mode (23.971 from Nvidia) there are dropped frames. That's all for 2D. In 3D, the Nvidia panel and Windows display properties have both 23p and 24p modes, but madVR doesn't let me choose! There is no such option in the display modes, something like "1080p3D24p" ((

PS: ReClock works with bitstream audio with no problems!
I believe ReClock works on the audio alone, speeding it up or slowing it down to match the display refresh rate. madVR can change the speed of a 25 fps video to 24 fps; ReClock would then slow down the audio by the same amount.
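For scale, the 25-to-24 fps case described here works out as follows (a worked sketch; the pitch figure assumes the audio is resampled without pitch correction):

```python
import math

# Playing 25 fps content at 24 fps: how big is the speed change?
content_fps = 25.0
target_fps = 24.0

speed_factor = target_fps / content_fps       # 0.96 - video runs 4% slower
runtime_100min = 100 / speed_factor           # a 100-minute film now runs longer
pitch_semitones = 12 * math.log2(speed_factor)  # pitch shift if uncorrected

print(f"{(1 - speed_factor) * 100:.1f}% slowdown")  # 4.0% slowdown
print(f"{runtime_100min:.1f} minutes")              # 104.2 minutes
print(f"{pitch_semitones:.2f} semitones")           # -0.71 semitones
```

A 4% slowdown is noticeable in runtime but the slight pitch drop is what some listeners object to, which is why resamplers often offer pitch correction.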
Old 27th July 2016, 18:33   #38937  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Quote:
Originally Posted by Warner306 View Post
I believe ReClock works on the audio alone, speeding it up or slowing it down to match the display refresh rate. madVR can change the speed of a 25 fps video to 24 fps. ReClock would then slow down the audio by the same amount.
No, ReClock changes the video speed and then resamples the audio to match. ReClock by itself will slow 25p down to 24p; I've used that feature many times. I'm not sure exactly what madVR is doing with that. I suppose that function applies if you use madVR to automatically select the refresh rate.
Old 27th July 2016, 18:38   #38938  |  Link
Stereodude
Registered User
 
Join Date: Dec 2002
Location: Region 0
Posts: 1,257
Quote:
Originally Posted by Sunset1982 View Post
Ok, last post on the GTX 1060 / RX 480 topic, let's get back to madVR (you can write me a PM if you have questions):

here is my updated comparison:

http://www.file-upload.net/download-...ison.xlsx.html
So, is Pascal better in madVR than Maxwell, or did the RX 480 take a step back from previous AMD hardware? Or maybe both?

In the past, an AMD card that got beaten by a given Nvidia card in games could equal or beat it with madVR. For example, a GTX 960 had about 10% better render times in madVR than an R7 260X, but the GTX 960 would be something like 50% ahead of the R7 260X in games. The AMD cards tended to have a lot more shader/compute power for the price, which didn't seem to get reflected in game performance. Now the RX 480 and the GTX 1060 are effectively tied in madVR (averaging your results for all test cases), and their gaming performance is fairly comparable, with the GTX 1060 leaning DX11 and the RX 480 leaning DX12/Vulkan.

The GTX 960 had about 10% more FLOPS than the R7 260X and was about 10% faster in madVR. But the RX 480 has 50% more FLOPS than the GTX 1060, yet isn't any faster in madVR?

I was hoping to buy a RX 480 to upgrade my HTPC from an R9 380, but I'm not sure it's really much faster.
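A back-of-envelope check of the "50% more FLOPS" claim, using approximate public shader counts and base clocks (treat the exact clock figures as assumptions; boost behaviour varies card to card):

```python
# Theoretical single-precision throughput: 2 FLOPs per shader per clock (FMA).

def tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000.0

rx480 = tflops(2304, 1.266)      # RX 480: 2304 shaders
gtx1060 = tflops(1280, 1.506)    # GTX 1060: 1280 shaders, base clock
print(f"RX 480:   {rx480:.1f} TFLOPS")   # ~5.8
print(f"GTX 1060: {gtx1060:.1f} TFLOPS") # ~3.9
print(f"ratio: {rx480 / gtx1060:.2f}")   # ~1.51, matching the '50% more' figure
```

The ratio lines up with the post; the open question is why that raw-compute advantage no longer translates into better madVR render times on Polaris vs Pascal.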

Last edited by Stereodude; 27th July 2016 at 18:40.
Old 27th July 2016, 19:39   #38939  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,801
Quote:
Originally Posted by 70MM View Post
Are you suggesting here not to have "use coloured noise" and "change dither for every frame" ticked?

I'm currently using Error Diffusion Option 2 with the two above ticked...

Is it cleaner without using them?

I only ticked them because a tutorial on the web said to tick both.
I also much prefer them both disabled but that is on a very different screen, 27" 2560x1440 or a 1080p Plasma, so you will have to judge for yourself. I quite dislike chroma noise though.

I agree with ryrynz too, max NNEDI3 64 for chroma. I actually use Reconstruction soft + SR1 for chroma, or sometimes NNEDI3 32.
__________________
madVR options explained
Old 27th July 2016, 22:29   #38940  |  Link
robl45
Registered User
 
Join Date: Dec 2012
Posts: 157
Quote:
Originally Posted by strumf666 View Post
I edited my first reply for more info&clarity. You can use a program like gpuz for gpu load or any other (there are many), but you don't actually need that, because massive frame drops are easily visible, for minor you can use the OSD (ctrl+j) which has a dropped frames counter.
Your assumption isn't entirely correct, because CPU and GPU aren't fully interchagable in regards to the work madVR requires and even then the GPU is usually much more powerful.
Yes, this is the problem. I found a couple of pages that show settings and I tried those, and the GPU was just maxed at 100% and the video wouldn't play, so I reset to defaults. That's why I was asking what the recommended settings are for the Intel HD 530.
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling