2nd March 2019, 18:39 | #55101 | Link | |
Registered User
Join Date: May 2004
Posts: 5,351
|
Quote:
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
|
2nd March 2019, 19:01 | #55103 | Link | |
Registered User
Join Date: May 2013
Posts: 706
|
Quote:
Yeah, that matches my performance stats. These are the chroma difference images I took; I really don't think breaking a sweat over NGU chroma is worth it. I'm not against buying more GPU, but chroma wouldn't be the big-ticket reason for it.
Nearest neighbor vs NGU Very High: https://imgur.com/b7INySV
NGU Low vs NGU Very High: https://imgur.com/wF5QuB7
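For anyone curious how this kind of difference comparison works in principle, here is a toy pure-Python sketch (not the method actually used for the linked images, and not NGU itself): two stand-in upscalers, pixel repetition versus simple linear interpolation, applied to a small chroma ramp, with the per-pixel difference between their outputs. All names and values are invented for illustration.

```python
# Toy comparison of two chroma upscalers; NOT madVR's NGU algorithm,
# just an illustration of how difference images are computed.

def nearest_2x(plane):
    """Upscale a 1-D 'chroma' row 2x by pixel repetition (nearest neighbor)."""
    out = []
    for p in plane:
        out += [p, p]
    return out

def linear_2x(plane):
    """Upscale 2x with simple linear interpolation between neighbors."""
    out = []
    for i, p in enumerate(plane):
        nxt = plane[min(i + 1, len(plane) - 1)]
        out += [p, (p + nxt) / 2]
    return out

chroma = [16, 32, 64, 128, 200, 235]  # a small chroma ramp
diff = [abs(a - b) for a, b in zip(nearest_2x(chroma), linear_2x(chroma))]
print(max(diff))  # largest per-pixel deviation between the two scalers -> 36
```

A real comparison would do this in 2-D on actual 4:2:0 chroma planes, but the principle is the same: upscale twice, subtract, and look at where the scalers disagree.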
__________________
Ghetto | 2500k 5Ghz |
|
2nd March 2019, 20:14 | #55105 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
With D3D11 copyback, on the latest test build of the live algo, I get around 35 ms rendering. That goes down to 25 ms with D3D11 native, but I need copyback for black-bar detection and UHD Blu-ray menus with jRiver. That's with 16:9 content (Pacific Rim) as a worst-case scenario, so it would be less taxing with widescreen content. I can run NGU chroma AA High in native, but frankly NGU Medium is fine. By the way, we have the same ancient CPU.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K |
|
2nd March 2019, 23:22 | #55110 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
No. They are test settings. They won't make it into the final build. They are for figuring out what the proper settings are for each option.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
3rd March 2019, 00:35 | #55112 | Link |
Registered User
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
|
Has anyone used the "HDR" boolean with profiles? I'm trying to set up a profile group under Devices -> Display called "Bit Depth", with two profiles (HDR, SDR) that assign Properties -> Display Bit Depth depending on what's being played: "Auto" for HDR, and 8 bit for SDR. The rule is as simple as can be:
if (HDR) "HDR" else "SDR"
The problem is that madVR always uses the SDR profile, and therefore won't switch the TV into HDR mode. As some of you might know, for a while now setting the native display bit depth in madVR to anything other than 10 bit or Auto won't trigger HDR, at least with Nvidia cards. What I'm trying to achieve is to have only HDR content use 10 bit or Auto, and SDR use 8 bit. All this is with the video card output set to 8 bit RGB full, of course.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex |
3rd March 2019, 00:48 | #55113 | Link |
Registered User
Join Date: Jun 2018
Posts: 51
|
Fullscreen Window 8bit?
Hey guys,
When I enable "use fullscreen window" in Kodi DSPlayer and press Ctrl+J to bring up the madVR OSD, I see "D3D11 fullscreen windowed (8bit)", but when I don't select "use fullscreen window" it says "D3D11 exclusive (10bit)". Can I kindly ask what the difference is? Does this mean my video is being output in 8 bit rather than 10 bit? If so, that's not good, as I use tone mapping for my 4K projector with Kodi DSPlayer and madVR, and most of my viewing is 10-bit 4K HDR movies. The reason I use the "use fullscreen window" option is that my Windows desktop is set to 3840x1620, as is my Kodi DSPlayer resolution, so the Kodi GUI stays in scope format within my 2.39:1 screen. The movies themselves are untouched; they play at 3840x2160, as can be seen in the attached pictures. But why does it now say 8 bit, whereas before it always said 10 bit? Is there a way to force 10-bit output whenever a 4K or HDR movie is playing? Please see the attached images for reference. Many thanks. Last edited by mkohman; 3rd March 2019 at 00:50. |
3rd March 2019, 00:49 | #55114 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
If you really want or need 10-bit output, then you need to set the driver to 12 bits, which causes its own range of issues.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K |
|
3rd March 2019, 01:04 | #55115 | Link | |
Registered User
Join Date: Mar 2009
Posts: 3,646
|
Quote:
See how it looks to you. |
|
3rd March 2019, 01:11 | #55116 | Link | |
Registered User
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
|
Quote:
I don't see any banding when I use 10-bit in madVR with my 8-bit Nvidia output, but I know it's not ideal. I do want everything to be 8-bit, but when madVR is set to 8-bit, the display won't switch into HDR mode. This used to work a while ago. Not sure if newer versions of madVR or Nvidia drivers are to blame. That's why I'd like to use the HDR boolean, but it doesn't seem to work properly.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex |
|
3rd March 2019, 01:24 | #55117 | Link | |
Registered User
Join Date: May 2013
Posts: 706
|
Quote:
__________________
Ghetto | 2500k 5Ghz |
|
3rd March 2019, 01:29 | #55119 | Link |
Registered User
Join Date: May 2004
Posts: 5,351
|
Lots of things. They're working out how best to identify scene changes for dynamic HDR. I doubt madshi has put much effort into optimizing it yet; they're just seeing what looks good right now. That's the trade-off for using test builds. It's also why I disable the dynamic HDR code and just use measurements.
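madVR's actual scene-change detection isn't public, so purely as an illustration of the general idea, here is a minimal sketch of one common approach: flag a cut when the luminance histograms of consecutive frames diverge sharply. The function names, bin count, and threshold are all invented for the example.

```python
# Conceptual sketch of histogram-based scene-change detection for
# dynamic tone mapping. This is NOT madVR's algorithm, just one
# common way to flag scene cuts from per-frame statistics.

def luma_histogram(frame, bins=16, max_val=255):
    """Coarse, normalized luminance histogram of a frame (flat pixel list)."""
    hist = [0] * bins
    for px in frame:
        hist[min(px * bins // (max_val + 1), bins - 1)] += 1
    total = len(frame)
    return [h / total for h in hist]  # normalize so any frame sizes compare

def is_scene_change(prev, cur, threshold=0.5):
    """Flag a cut when histograms differ by more than `threshold`
    (sum of absolute bin differences: 0 = identical, 2 = disjoint)."""
    dist = sum(abs(a - b)
               for a, b in zip(luma_histogram(prev), luma_histogram(cur)))
    return dist > threshold

dark = [20] * 100    # a dark frame
dark2 = [22] * 100   # nearly identical next frame
bright = [230] * 100 # hard cut to a bright scene
print(is_scene_change(dark, dark2))   # False: same scene
print(is_scene_change(dark, bright))  # True: cut detected
```

The hard part in practice, and presumably what the test builds are iterating on, is picking statistics and thresholds that catch real cuts without firing on fades, flashes, or camera pans.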
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED |
3rd March 2019, 03:20 | #55120 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
I tested this countless times in the past. The end device has nothing to do with it: if the GPU driver respects the 8-bit setting, the banding has to be created on the GPU side.
@mkohman: please upload the images somewhere else; a link to them is enough. |
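To see why the banding has to come from the quantization step, here is a toy model (not the actual driver pipeline): truncating a subtle 10-bit gradient to 8 bits collapses adjacent levels into bands, while adding sub-LSB noise before truncating preserves the lost precision on average, which is what dithering exploits.

```python
import random

# Toy model of 10-bit -> 8-bit quantization; not the GPU driver's
# actual pipeline, just the arithmetic behind banding vs. dithering.

ramp10 = list(range(512, 528))  # 16 adjacent 10-bit levels (a subtle gradient)

# Plain truncation: drop the low 2 bits -> neighboring levels collapse.
truncated = [v >> 2 for v in ramp10]
print(sorted(set(truncated)))   # only 4 distinct 8-bit levels remain

# Random dithering: add sub-LSB noise before truncating so the
# *average* output still tracks the original 10-bit value.
random.seed(0)
def dither8(v10, samples=10000):
    return sum((v10 + random.randrange(4)) >> 2 for _ in range(samples)) / samples

print(round(dither8(514), 2))   # ~128.5: the lost precision survives on average
```

If the driver silently re-expands or re-quantizes behind madVR's back (as discussed above with 10-bit versus 12-bit output), that extra quantization is where visible banding can sneak back in.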
|