5th May 2018, 17:40 | #50681 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
I'm pretty sure you can run two different instances of LAV also. Why not try your two different versions of madVR with your two different players and see what each shows as a version in the Windows notification area on your taskbar. Press Ctrl + S during playback from each player and see what version of madVR it displays.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W Last edited by brazen1; 5th May 2018 at 17:45. |
5th May 2018, 17:44 | #50682 | Link |
Registered User
Join Date: Nov 2013
Location: Stockton, CA
Posts: 21
Yes, they both show different versions when playing from each player, and they both have their own settings.bin.
__________________
GTX1660Ti 6GB, H270M, i5-7500 3.4GHz, 16GB PC4-19200,Win10 Pro 64, JriverMC, Silicondust Tv Tuners, Denon AVRX2200, Sony XBR-Z9D |
5th May 2018, 18:02 | #50683 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
Hmmmm. I usually dial back my madVR settings to accommodate all videos that fall within certain profiles. I don't want to tailor madVR or anything else to the variances between titles beyond what profiles already offer. For instance, one 1080p title can use some very high settings while another 1080p title needs the settings dialed back, yet there is nothing in a profile that would distinguish between the two titles; one is simply more resource intensive than the other. Perhaps I could use one version of madVR, with settings appropriate for one title, in MPC-BE, and another version of madVR for the more intensive titles in MPC-HC, for example? I can call up as many different players in my front end (KODI) as I desire to add. Presently I'm using 5 different players, each assigned as default where it shines best.
The MPC players only let you select which renderer to use, not which version of the same renderer. This could pose a problem: madVR versions would need to be assigned in the front end software, unless someone knows how to associate madVR versions with players. This could be a wild goose chase, and I only want to use one front end. One setup that would appear to work is ZoomPlayer with one madVR version and MPC with another. I may give this a try....
Last edited by brazen1; 5th May 2018 at 18:11.
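Worth noting for anyone trying this: madVR's profile auto-select rules can only key on measurable source properties, not on how demanding a title is, which is exactly the limitation described above. A sketch of what a rule group looks like (property names such as srcWidth and srcHeight quoted from memory, so double-check them against madVR's profile editor):

```
if (srcWidth <= 1920) and (srcHeight <= 1080) "1080p"
else "UHD"
```

Since two equally sized 1080p titles always land in the same profile, the only per-title escape hatch would be a file path match (e.g. keeping the heavy titles in a specially named folder), if your madVR build supports path-based rules.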
5th May 2018, 19:02 | #50684 | Link |
Registered User
Join Date: Jul 2016
Posts: 52
|
After installing an NVIDIA GTX 1080 and getting tired of the constant bugs in the various drivers, I put my "old" AMD RX480 back in, and the first thing I noticed was that even with version 92.14 I got 10bit and not 8bit, so the "problem" exists only for NVIDIA.
5th May 2018, 19:16 | #50685 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
Yes, we know. Nvidia does not offer RGB 10bit and never has; only 8 and 12. Just use 8bit until maybe one day they decide to get around to it, unless your display handles 12bit perfectly..... yawn.
5th May 2018, 21:11 | #50686 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
People are not noticing a change in image quality; they only notice the change in the number in the OSD. If madVR lied and said it was outputting 10 bit when it was really dithered 8 bit, no one who notices this change now would have noticed anything. You have to run very specific tests with dithering disabled to even be able to tell whether the output is 8 or 10 bit.

I really do not understand why so many people are so obsessed with 10 or 12 bit output. We have people using 4:2:2 chroma subsampling just because they want 12 bit. I can see why madshi didn't worry about 10 bit for so long; bit depth is massively overrated by the casual HTPC user. Then you have displays like mine, which offer better image quality with 8 bit input, and people still want to send them 10 bit. Even I have to recheck this fact every once in a while: 10 bit just seems like it should be much better, but it really isn't.
__________________
madVR options explained Last edited by Asmodian; 5th May 2018 at 21:19. |
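Asmodian's point about dithered bit depth being hard to see can be made concrete with a toy experiment (deliberately simplistic, and nothing to do with madVR's actual dithering algorithm; the `quantize` helper is purely illustrative): quantizing a smooth ramp with TPDF dither leaves residual noise of roughly half an LSB, so going from 8 to 10 bits only shrinks an already tiny noise floor by about 4x.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(signal, bits, dither=True):
    """Quantize a [0,1] float signal to the given bit depth, optionally
    adding TPDF (triangular) dither before rounding so the quantization
    error becomes signal-independent noise instead of banding."""
    levels = (1 << bits) - 1
    x = signal * levels
    if dither:
        # sum of two uniforms = triangular noise, +/-1 LSB peak-to-peak
        x = x + rng.uniform(-0.5, 0.5, x.shape) + rng.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x), 0, levels) / levels

# a smooth dark ramp, the kind of content where banding shows first
ramp = np.linspace(0.0, 0.1, 100_000)

for bits in (6, 8, 10):
    err = quantize(ramp, bits) - ramp
    print(f"{bits:>2} bit dithered: RMS error = {err.std():.2e}")
# each 2-bit step cuts the residual noise by roughly 4x
```

Swapping the print loop for a plot of `err` makes the point visually: the 8 bit residue is already noise-like and tiny, and the 10 bit residue is the same noise, just about 4x smaller.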
5th May 2018, 21:42 | #50688 | Link | ||
King of the Jungle
Join Date: Mar 2003
Location: Shoreditch, London
Posts: 429
It does make me wonder, though, about a lot of the current high-end sets that say they are 10 bit but are actually 8 bit + FRC, which as I understand it is dithering at the TV end. In this scenario are we best off sending 8 bit from madVR, or 10 bit and above?
5th May 2018, 21:47 | #50689 | Link | |
Registered User
Join Date: Feb 2005
Posts: 38
5th May 2018, 22:10 | #50690 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
I'm aware that many people should be using 8 bit for various reasons, and using 4:2:2 just to get 10 bit is clearly a bad idea, but if you can get 10 bit 4:4:4, that's a bit less noise than with 8 bit 4:4:4, so I don't see why this should go. Of course, most of you have displays that can't handle more than 8 bit, or that could handle 10 bit but not the 12 bit that nVidia sends, so you keep harping on about there being "no difference" between 8 bit and 10 bit, but that's simply not true if you have a 10 bit or 12 bit capable panel. And by the way, I don't know any "casual" HTPC user who uses madVR. I'm certainly not a "casual" HTPC user, and neither are you or most of those posting in this thread. There is nothing better about 10 bit regarding banding, as madVR's dithering is excellent; there is simply a bit less noise than with 8 bit.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 5th May 2018 at 22:15. |
5th May 2018, 23:19 | #50691 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Oh, I agree. It would be better to have SDR modes still output 10 bit in fullscreen windowed; it was simply much worse to output 10 bit HDR in fullscreen windowed with an Nvidia GPU, and this was a very quick fix.

madshi's fix was too simple, but there are probably a lot more users running HDR passthrough than users who get a real benefit from 10 bit (sorry, you are one of the few). Almost everyone with an HDR display is going to be using an HDR output mode, because otherwise the display does not even enable its HDR mode. The only reason to use SDR output with HDR content is if you do not have an HDR display, or in your very special case of custom tone mapping that assumes HDR content in what the OS and drivers treat as SDR. Those users are also very likely to enable 10 bit, because HDR displays support it and of course 10 is "better". I have seen a lot of users with LG or Samsung HDR TVs and Nvidia GPUs in this thread.

I am not saying your case is bad, or even pointless; we want the best quality possible at all times, and madVR should enable 10 bit output for fullscreen windowed any time it is not using Nvidia's HDR API. I was simply pointing out that 10 bit is subtle enough that most people can only notice it in the OSD, not in the image, even with true 10 bit panels. Comparing 6 bit dithering to 8 bit dithering is a good way for anyone to get an idea of the difference even without a proper 10 bit panel, and 6 to 8 bit is a MUCH more significant difference than 8 to 10 bit. Seeing the noise from static 8 bit ordered dithering done in linear light is... tricky; it does affect the image quality but it is very hard to see. If it does affect you negatively, you have to use FSE or an older version of madVR for now.
Last edited by Asmodian; 5th May 2018 at 23:23.
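The "static ordered dithering" mentioned above is easy to demo with a toy Bayer-matrix quantizer (a gross simplification of what madVR does, and in gamma rather than linear light, so purely illustrative): the fixed threshold pattern trades hard bands for a fine static pattern that local averaging, by the panel or by your eye, smooths back toward the original gradient.

```python
import numpy as np

# Classic 4x4 Bayer threshold matrix, scaled to [0, 1)
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither(img, bits):
    """Quantize a [0,1] grayscale image to `bits` per pixel using a
    static ordered (Bayer) dither: the tiled threshold decides whether
    each pixel rounds up or down, so smooth gradients become fine
    patterns instead of hard bands."""
    levels = (1 << bits) - 1
    h, w = img.shape
    thresh = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.floor(img * levels + thresh) / levels

# A smooth horizontal ramp, 16 rows tall
ramp = np.tile(np.linspace(0.0, 1.0, 256), (16, 1))

plain    = np.floor(ramp * 3) / 3        # 2-bit truncation: 4 hard bands
dithered = ordered_dither(ramp, 2)       # same 4 levels, but dithered

# Average each column (roughly what your eye does over the pattern):
# the dithered version tracks the ramp far more closely.
err_plain    = np.abs(plain.mean(axis=0)    - ramp[0]).max()
err_dithered = np.abs(dithered.mean(axis=0) - ramp[0]).max()
print(f"max column-averaged error: plain={err_plain:.3f}  dithered={err_dithered:.3f}")
```

Viewing `plain` and `dithered` as images (e.g. with matplotlib's `imshow`) shows the four hard bands versus the checkered transitions between them.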
6th May 2018, 00:02 | #50692 | Link | |
Registered User
Join Date: Sep 2009
Location: Berlin
Posts: 173
6th May 2018, 03:44 | #50693 | Link |
Registered User
Join Date: Aug 2015
Posts: 29
Hey guys, kind of a dumb question that may have already been answered: for bitmap subtitles played back from a DVD source, is there any way to skip upscaling of the subtitles and leave them untouched? They seem to get upscaled along with the image, which makes the text look pretty smudgy.
Just curious.
6th May 2018, 04:27 | #50694 | Link | |
Registered User
Join Date: Aug 2005
Posts: 54
VobSub had a cool option back in the day to "vectorize" bitmap subtitles. It sometimes broke, but when it worked it was so good. I don't think anything offers that anymore.
6th May 2018, 04:52 | #50695 | Link |
Registered User
Join Date: Nov 2016
Posts: 181
Has anyone else had problems with the Windows 10 1803 update? Something strange happened after the update on my laptop: the render times dropped from 16 ms to 10 or 12 ms (2K upscaled to 4K). I know, it's a good thing, but it seems very strange to me. Usually after an MS update things never get better... I checked all the madVR settings and they are the same as yesterday...
Example: 4K HDR windowed mode downscaling (screenshots: before update / after update).
__________________
"To infinity, and beyond!" Last edited by Oguignant; 6th May 2018 at 05:14. |
6th May 2018, 06:07 | #50698 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
I get lower rendering times reported too, but I cannot push my settings any higher; now I start getting dropped frames at rendering times much lower than the frame time. I think this is a change in how the rendering time is measured, not a change in performance.

This is entirely dependent on the display. If it has a good internal pathway that preserves madVR's dithering in 10 bit all the way to the panel, and its FRC algorithm is good, it can be better to send it 10 bit. Most of my displays have not been ultra high end and have not done 10 bit better than 8 bit. I did have a 10 bit monitor with a very good 10 bit pathway, at least as far as I could determine in testing, and it was an 8 bit + FRC panel.
Last edited by Asmodian; 6th May 2018 at 09:05.
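For readers wondering what 8 bit + FRC actually means: FRC is just dithering in time. A minimal sketch with made-up numbers (a real panel's FRC is far smarter about spatial patterns and flicker):

```python
import numpy as np

# FRC (frame rate control): an 8-bit panel alternates a pixel between two
# adjacent 8-bit codes so its time-averaged brightness lands in between,
# emulating an intermediate "10-bit" level.
target = 100.25 / 255                                       # a quarter-step above code 100
frames = np.where(np.arange(60) % 4 == 0, 101, 100) / 255   # code 101 on 1 frame in 4
print(abs(frames.mean() - target))                          # time average lands on target
```

Whether this beats feeding the panel 8 bit directly depends on exactly what Asmodian describes: how well the set's internal pathway preserves the incoming 10 bit signal before its own FRC stage runs.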
6th May 2018, 06:57 | #50699 | Link | |
Registered User
Join Date: Aug 2015
Posts: 29